LabKey Support Forum

Project migration between servers - Incomplete
(1 response) WayneH 2021-06-30 07:19

Good morning,

We recently ran into an issue migrating a project between servers in our v18.2 Community Edition. We are preparing to migrate to a licensed current version but are really concerned about the problem we ran into. On two different occasions, one user's migration did not transfer over the complete subfolder contents. I performed the same migration myself using the export/import tools and found that everything transferred over except for the file repository and the assay data (designs, batch/run data and tables). In my case I selected all of the options, including files and all subfolder content.
The project is separated into three folders: one main folder and two subfolders within it. The assay data and the file repository are in one of those subfolders; more precisely, the assay data is in the top-level folder, while the designs/templates are in the subfolder where the file repository is located. I believe this is the structure we have.
Any thoughts on why we're having trouble migrating the project completely?

Thanks

Wayne

view message
Problem to Import FlowJo Workspace
(1 response) johann pellet 2021-06-11 07:59

Hi,

I have a problem with my dataset when I am trying to import a FlowJo workspace into my LabKey server (LabKey Server 19.3.0).

11 Jun 2021 16:51:18,000 DEBUG: Analysis results contains 275 statistics, 76 graphs
11 Jun 2021 16:51:21,656 ERROR: FlowJo Workspace import failed
org.labkey.api.query.RuntimeValidationException: name: Value is too long for column 'name', a maximum length of 256 is allowed. Supplied value was 259 characters long.
    at org.labkey.api.data.Table.insert(Table.java:708)
    at org.labkey.flow.persist.FlowManager.ensureAttributeName(FlowManager.java:468)
    at org.labkey.flow.persist.FlowManager.ensureAttributeName(FlowManager.java:484)
    at org.labkey.flow.persist.FlowManager.ensureAttributeNameAndAliases(FlowManager.java:549)
    at org.labkey.flow.persist.FlowManager.ensureStatisticNameAndAliases(FlowManager.java:508)
    at org.labkey.flow.persist.AttributeSetHelper.ensureStatisticNames(AttributeSetHelper.java:109)
    at org.labkey.flow.persist.AttributeSetHelper.prepareForSave(AttributeSetHelper.java:89)
    at org.labkey.flow.script.WorkspaceJob.extractAnalysis(WorkspaceJob.java:371)
    at org.labkey.flow.script.WorkspaceJob.createExperimentRun(WorkspaceJob.java:170)
    at org.labkey.flow.script.WorkspaceJob.createExperimentRun(WorkspaceJob.java:148)
    at org.labkey.flow.script.AbstractExternalAnalysisJob.doRun(AbstractExternalAnalysisJob.java:233)
    at org.labkey.flow.script.WorkspaceJob.doRun(WorkspaceJob.java:138)
    at org.labkey.flow.script.FlowJob.run(FlowJob.java:76)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
    at java.base/java.lang.Thread.run(Thread.java:831)
Caused by: name: Value is too long for column 'name', a maximum length of 256 is allowed. Supplied value was 259 characters long.
    at org.labkey.api.query.RuntimeValidationException.<init>(RuntimeValidationException.java:36)
    ... 19 more
11 Jun 2021 16:51:21,693 INFO : Job completed at 2021-06- 16:51
11 Jun 2021 16:51:21,693 INFO : Elapsed time 40.326s

The 'name' value is too long to be inserted in the database (as I understand it). Could you please tell me which name this value refers to?

Thank you.

Regards,
Johann
view message
Maintenance Notice - artifactory.labkey.com update for Thursday, June 10th 2021 @ 8am PDT
(1 response) Jon (LabKey DevOps) 2021-06-08 15:53

The artifactory.labkey.com server will be taken offline for maintenance on Thursday, June 10th 2021 @ 8am PDT for approximately 30 minutes max.

During this time, the site will be unavailable and the TeamCity build queue will be paused.

We apologize for any inconvenience this may cause.

view message
Maintenance Notice: Artifactory outage for Friday, June 4th 2021 @ 8am PST
(1 response) Jon (LabKey DevOps) 2021-06-03 14:42

The artifactory.labkey.com server will be taken offline for maintenance on Friday, June 4th 2021 @ 8am PST for approximately 30 minutes max.

During this time, the site will be unavailable and the TeamCity build queue will be paused.

We apologize for any inconvenience this may cause.

view message
Labkey Community Edition Consulting
(1 response) ophira 2021-06-02 01:25

Hello,

We started using Labkey Community Edition. For now, we are trying to avoid the costs of Premium editions, at least until we are certain that this solution is right for our organization.

However, we would love to consult with more experienced people regarding some design and implementation topics.
For this purpose, we are looking for someone who offers consulting services.

Are there organizations/individuals outside of Labkey who offer such a service?

Thanks,
Ophir

view message
Maintenance Notice: Migration of artifactory.labkey.com for Sunday, May 30th @ 12pm PST
(1 response) Jon (LabKey DevOps) 2021-05-27 15:18

We will be migrating the current artifactory.labkey.com instance to a new server starting on Sunday, May 30th @ 12pm PST.

During this time, all TeamCity jobs will be paused and Artifactory will be unavailable during the migration.

The estimated downtime is approximately seven hours.

We apologize for any inconvenience this may cause.

view message
Creating a single survey that posts data to multiple lists
(1 response) caroline norris 2021-05-20 04:03

Is it possible to use the Survey feature to create a single survey that posts data to more than one list on submission?

There is no information on how to do this in the Survey documentation, so I'm just wondering whether it's possible without any custom development on our part.

view message
R Reports - R Markdown - knitr/pandoc - The input must be a UTF-8 encoded text.
(2 responses) hilariagrieve 2021-05-19 08:19

Hi,
I have installed the latest version of LabKey Server on a new server and reloaded the study folder; I ran into some trouble but finally got it done.
However, I have problems with R reports. I remember having the same problem on the old server and solving it, but I don't remember how.
I have researched and tried a lot of things, but I haven't been able to fix it.
The problem is with R Markdown and knitr: the error says the input must be UTF-8 encoded text, and that the file script.Rmd is not encoded in UTF-8.
I tried File > Save with Encoding > UTF-8 in RStudio, as described at https://support.rstudio.com/hc/en-us/articles/200532197-Character-Encoding.
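
If it is just an encoding mismatch in the report source, one thing I could also try is re-encoding the file from R before the report runs. A rough sketch, assuming the file is currently latin1 (only a guess for a Spanish-locale Windows machine):

# Convert script.Rmd from latin1 (assumed) to UTF-8 in place.
txt <- readLines("script.Rmd", encoding = "latin1")
writeLines(enc2utf8(txt), "script.Rmd", useBytes = TRUE)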

Here is the log file:

Error executing command
javax.script.ScriptException: javax.script.ScriptException: An error occurred when running the script 'script.R', exit code: 1).

processing file: script.Rmd

|
| | 0%
|
|.......... | 14%
ordinary text without R code

|
|.................... | 29%
label: labkey (with options)
List of 1
$ echo: logi FALSE

|
|.............................. | 43%
ordinary text without R code

|
|........................................ | 57%
label: setup (with options)
List of 7
$ echo : logi FALSE
$ cache : logi TRUE
$ results : chr "hide"
$ warning : logi FALSE
$ comment : logi FALSE
$ message : logi FALSE
$ comments: chr ""

Loading required package: httr
Loading required package: jsonlite
Executing: pandoc -t html -o "script.html" "script.Rmd"
[WARNING] Could not deduce format from file extension .Rmd
Defaulting to markdown
UTF-8 decoding error in script.Rmd at byte offset 995 (e9).
The input must be a UTF-8 encoded text.
Quitting from lines 24-32 (script.Rmd)
Error in (function (input, format, ext, exec, cfg) : conversion failed
Calls: render ... eval -> eval -> <Anonymous> -> mapply -> <Anonymous>
Además: Warning messages:
1: In readLines(con, warn = FALSE) :
entrada inválida encontrada en la conexión de entrada 'script.Rmd'
2: In xfun::read_utf8(input) :
The file script.Rmd is not encoded in UTF-8. These lines contain invalid UTF-8 characters: 15, 46, 53, 59, 61, 72, ...
3: In read_utf8(input[1]) :
The file script.Rmd is not encoded in UTF-8. These lines contain invalid UTF-8 characters: 15, 46, 53, 59, 61, 72, ...

Ejecución interrumpida

view message
Trouble Starting LabKey Server Community Edition
(4 responses) etr 2021-05-07 15:27

Hello,
I am trying to install LabKey Server but I am experiencing some difficulties.
Please keep in mind that I am not a regular user of either Windows or Tomcat.
Running on Windows Server 2012 R2, OpenJDK 16, Tomcat 9.0.

The Catalina log says something like this:

07-May-2021 22:05:40.736 SEVERE [Catalina-utility-2] org.apache.catalina.startup.HostConfig.deployDescriptor Error deploying deployment descriptor [C:\labkey\apps\tomcat\conf\Catalina\localhost\labkey.xml]
        java.lang.IllegalStateException: Error starting child
                at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:720)
                at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:690)
                at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:706)
                at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:689)
                at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1881)
                at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
                at java.util.concurrent.FutureTask.run(Unknown Source)
                at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
                at java.util.concurrent.AbstractExecutorService.submit(Unknown Source)
                at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:582)
                at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:472)
                at org.apache.catalina.startup.HostConfig.check(HostConfig.java:1660)
                at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:315)
                at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
                at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1151)
                at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1353)
                at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1357)
                at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1335)
                at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
                at java.util.concurrent.FutureTask.runAndReset(Unknown Source)
                at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source)
                at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
                at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
                at java.lang.Thread.run(Unknown Source)
        Caused by: org.apache.catalina.LifecycleException: Error starting the loader
                at org.apache.catalina.loader.WebappLoader.startInternal(WebappLoader.java:432)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5028)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:717)
                ... 25 more
        Caused by: java.lang.UnsupportedClassVersionError: org/labkey/bootstrap/LabKeyBootstrapClassLoader has been compiled by a more recent version of the Java Runtime (class file version 58.0), this version of the Java Runtime only recognizes class file versions up to 52.0
                at java.lang.ClassLoader.defineClass1(Native Method)
                at java.lang.ClassLoader.defineClass(Unknown Source)
                at java.security.SecureClassLoader.defineClass(Unknown Source)
                at java.net.URLClassLoader.defineClass(Unknown Source)
                at java.net.URLClassLoader.access$100(Unknown Source)
                at java.net.URLClassLoader$1.run(Unknown Source)
                at java.net.URLClassLoader$1.run(Unknown Source)
                at java.security.AccessController.doPrivileged(Native Method)
                at java.net.URLClassLoader.findClass(Unknown Source)
                at java.lang.ClassLoader.loadClass(Unknown Source)
                at java.lang.ClassLoader.loadClass(Unknown Source)
                at java.lang.Class.forName0(Native Method)
                at java.lang.Class.forName(Unknown Source)
                at org.apache.catalina.loader.WebappLoader.createClassLoader(WebappLoader.java:536)
                at org.apache.catalina.loader.WebappLoader.startInternal(WebappLoader.java:408)
                ... 29 more
07-May-2021 22:05:40.738 INFO [Catalina-utility-2] org.apache.catalina.startup.HostConfig.deployDescriptor Deployment of deployment descriptor [C:\labkey\apps\tomcat\conf\Catalina\localhost\labkey.xml] has finished in [10] ms

If anyone can provide some info, it would be much appreciated.

Thanks,
ET

view message
Maintenance Notice: Migration of artifactory.labkey.com for Thursday, April 29th @ 8am PST
(1 response) Jon (LabKey DevOps) 2021-04-27 11:13

We will be migrating the current artifactory.labkey.com instance to a new server starting on Thursday, April 29th at 8am PST.

During this time, all TeamCity jobs will be paused and Artifactory will be unavailable during the migration.

The estimated downtime is approximately two hours.

We apologize for any inconvenience this may cause.

view message
Updates to IntelliJ config files in develop
Susan Hert 2021-04-26 11:43

I have just merged PR #66 in the LabKey/server repository to bring in some changes to the IntelliJ configuration files and to add a couple of tasks for setting up IntelliJ to use the LabKey default configurations.  These changes should mean you have to adjust fewer IntelliJ configurations manually when starting with a new enlistment and have fewer files in your "Never Commit These" change list.  The changes work best with IntelliJ Version 2021.1 or later.  Older (but not ancient) versions work as well but require more manual adjustment.

To avoid conflicts between the configuration files that this change deleted and the ones that get modified by IntelliJ for your local installation, please do the following before pulling down these changes:

  • Shut down IntelliJ
  • Make a copy of your .idea directory (just in case)
  • Stash the current changes in the .idea directory using git stash push -- .idea/ from the root of your LabKey enlistment
  • git pull
  • ./gradlew ijConfigure
  • Open your existing project in IntelliJ

Then:

  • Open your project in IntelliJ
  • Do a Refresh in the Gradle window.
  • If you are not using IntelliJ version 2021.1 or later
    • Edit the LabKey Dev configuration to fix the module classpath so it uses labkey-server.server.modules.platform.api.main
    • Edit the LabKey Production configuration to fix the module classpath so it uses labkey-server.server.modules.platform.api.main

If you are developing with TypeScript:

  • Go to Preferences -> Appearance & Behavior -> Path Variables and set a value for NODE_PATH that will point to the node binary used by the build (e.g., C:\Development\labkeyHome\build.node\node-v12.16.0\bin\node, or, for Mac or Linux, use the symlink created by the build that is version-independent: /Users/develop/labkeyHome/.node/node/bin/node).
  • Go to Preferences -> Languages & Frameworks -> Typescript and verify the Node interpreter settings. The interpreter should be pointing to the node version that you set as the NODE_PATH variable. If not, or if you need to change this at some point, you can do that here.

For instructions on setting up a development environment from scratch, see this page.

If you have any problems that seem related to these changes, please let us know.

Susan

view message
How to create and append to a dataset using R labkey API?
(1 response) kristen dang 2021-04-23 10:42

Hello,
I am running into several errors when trying to use the Rlabkey API to create and load a dataset. I am able to load this same file as a dataset using the GUI, but cannot seem to do it with the R API.

When creating the design, I get an error about reserved fields. In the GUI I resolve this by matching specific upload file column names with reserved field names. I don't know how to do this in R:

fields <- labkey.domain.inferFields(baseUrl=labkey.getBaseUrl(), folderPath=fp, df=inDF)
 dd <- labkey.domain.createDesign(name=batch$batch, fields=fields)
 k = which(dd$fields$name %in% c("ParticipantId", "visitCode", "sampleName"))
 dd$fields$isPrimaryKey[k] = TRUE

    eaLabkeyList = labkey.domain.create(baseUrl=labkey.getBaseUrl(), folderPath=fp, domainKind="StudyDatasetVisit", domainDesign=dd, options=list(categoryName="SequencingTest",demographics=FALSE,useTimeKeyField=FALSE,keyPropertyName="sampleName"))

Error:

HTTP request was unsuccessful. Status code = 500, Error message = java.lang.IllegalArgumentException: Property: ParticipantId is reserved or exists in the current domain.
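
What I would like, if it is possible, is to drop the colliding field from the design before calling labkey.domain.create, something like the sketch below (assuming dd$fields behaves like a data frame, as the indexing above suggests, and that the dataset supplies ParticipantId itself; fp, inDF and batch are as in my snippet above):

library(Rlabkey)

fields <- labkey.domain.inferFields(baseUrl=labkey.getBaseUrl(), folderPath=fp, df=inDF)
dd <- labkey.domain.createDesign(name=batch$batch, fields=fields)

# Remove inferred fields whose names collide with reserved study columns.
drop <- which(dd$fields$name %in% c("ParticipantId"))
if (length(drop) > 0) dd$fields <- dd$fields[-drop, ]

labkey.domain.create(baseUrl=labkey.getBaseUrl(), folderPath=fp,
    domainKind="StudyDatasetVisit", domainDesign=dd,
    options=list(categoryName="SequencingTest", demographics=FALSE,
                 useTimeKeyField=FALSE, keyPropertyName="sampleName"))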

Once I create this file in the web UI, I'm not able to append other data to it by upload since the column names don't match. Is there some documentation for how to take care of the key name matching so I can do this in R?
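
For the appending part, I was picturing something like this sketch, assuming labkey.importRows can target a study dataset once the column names line up (the dataset name and input file below are placeholders):

library(Rlabkey)

# Column names must match what the dataset expects (ParticipantId, the visit
# column, sampleName, ...), so rename the data frame columns first if needed.
moreRows <- read.delim("new_batch.tsv", stringsAsFactors = FALSE)

labkey.importRows(baseUrl=labkey.getBaseUrl(), folderPath=fp,
    schemaName="study", queryName="SequencingTest_Dataset",
    toImport=moreRows)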

Thanks for any help.

view message
How to use createAndLoad for Lists or Datasets using R?
(1 response) kristen dang 2021-04-23 10:32

Hello,
In the Rlabkey package, I tried to use labkey.domain.createAndLoad to create and load a list, but was unable to understand the error message. I'd like to be able to use this command for both List and Dataset creation. Can you please suggest how I should address the issue with "strictFieldValidation"? I included my command and the response below.

labkey.domain.createAndLoad(baseUrl = labkey.getBaseUrl(),folderPath=fp,name="sequencing_vendor_QC_metrics", df=combined, domainKind="VarList")

Error in handleError(response, haltOnError) : 
HTTP request was unsuccessful. Status code = 500, Error message = Unrecognized field "strictFieldValidation" (class org.labkey.list.model.ListDomainKindProperties), not marked as ignorable (22 known properties: "entireListTitleTemplate", "discussionSetting", "keyName", "name", "entireListIndexSetting", "titleColumn", "eachItemBodyTemplate", "fileAttachmentIndex", "allowDelete", "listId", "eachItemBodySetting", "lastIndexed", "description", "allowExport", "eachItemTitleTemplate", "domainId", "entireListIndex", "allowUpload", "keyType", "entireListBodySetting", "entireListBodyTemplate", "eachItemIndex"])
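
In case it matters, this is the two-step pattern I may fall back on if createAndLoad itself is the problem: create the list domain first, then load the rows (a sketch only; the keyName option for a VarList is my assumption, and fp and combined are as in my command above):

library(Rlabkey)

fields <- labkey.domain.inferFields(baseUrl=labkey.getBaseUrl(), folderPath=fp, df=combined)
dd <- labkey.domain.createDesign(name="sequencing_vendor_QC_metrics", fields=fields)

# Create the list domain without loading data.
labkey.domain.create(baseUrl=labkey.getBaseUrl(), folderPath=fp,
    domainKind="VarList", domainDesign=dd,
    options=list(keyName="Key"))

# Then load the data into the new list.
labkey.insertRows(baseUrl=labkey.getBaseUrl(), folderPath=fp,
    schemaName="lists", queryName="sequencing_vendor_QC_metrics",
    toInsert=combined)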

view message
How to make a link in a List to a file in the Files tab using R API?
(1 response) kristen dang 2021-04-22 17:54

Hello,
I have several Lists that include a column listing relevant files which are also loaded in the Files tab. I would like to add a link to the List that points to the file in the Files tab. There is an extensive page on how to do this with the GUI, but I am not sure how to do it with the R labkey package. Can this be done with R?
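
What I have in mind is roughly the sketch below: storing a path to the file in a List column via labkey.insertRows, and then giving that column a URL property so it renders as a link. The list name, column names, and WebDAV-style path here are all placeholders:

library(Rlabkey)

# One row linking a sample to a file that lives under the project's file root.
newRow <- data.frame(
  sampleName = "S-001",
  dataFile   = "/_webdav/MyProject/@files/results/S-001_metrics.tsv",
  stringsAsFactors = FALSE
)

labkey.insertRows(baseUrl=labkey.getBaseUrl(),
    folderPath="/MyProject",
    schemaName="lists",
    queryName="SampleFiles",
    toInsert=newRow)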

view message
Maintenance Notice: Migration of artifactory.labkey.com for Monday, April 26th @ 8am PST
(1 response) Jon (LabKey DevOps) 2021-04-22 13:20
We will be migrating the current artifactory.labkey.com instance to a new server starting on Monday, April 26th at 8am PST.

During this time, all TeamCity jobs will be paused and Artifactory will be unavailable during the migration.

The estimated downtime is approximately two hours.

We apologize for any inconvenience this may cause.
view message
RLabkey: Permissions when inserting/updating rows
(1 response) christian opitz 2021-04-13 00:55

I am trying to update/insert rows using the provided R library. Whether I try with .netrc, explicitly set the admin user/password via labkey.setDefaults(), or use only the API key generated as admin, the result is:

Error in handleError(response, haltOnError) :
HTTP request was unsuccessful. Status code = 403, Error message = User does not have permission to perform this operation.

Everything (API key, credentials) is tied to the administrator. I can, however, retrieve the data using getRows() or selectRows().

The target of the operation is a targetedmslist table that was imported via Skyline.
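
For reference, the call pattern I am using looks roughly like this (the server URL, folder path, table name, and key value are placeholders):

library(Rlabkey)

# Register the admin API key once, then attempt the update.
labkey.setDefaults(apiKey="abcdef0123456789abcdef0123456789",
                   baseUrl="https://myserver.example.org/labkey")

# rowsToUpdate is a data frame containing the key column plus the fields to change.
labkey.updateRows(baseUrl=labkey.getBaseUrl(),
    folderPath="/MyProject/SkylineDocs",
    schemaName="targetedms",
    queryName="mytable",
    toUpdate=rowsToUpdate)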

Any help is appreciated.

Thank you.

view message
Maintenance Notice: Migration of artifactory.labkey.com for Monday, April 5th @ 11am PST
(1 response) Jon (LabKey DevOps) 2021-04-02 14:54
We will be migrating the current artifactory.labkey.com instance to a new server starting on Monday, April 5th at 11am PST.

During this time, all TeamCity jobs will be paused and Artifactory will be unavailable during the migration.

The estimated downtime is approximately one hour.

We apologize for any inconvenience this may cause.
view message
Problem while using the "Import and Export a XAR File" tutorial
(4 responses) ophira 2021-03-22 09:50

Hello,

I am trying to understand the Experiment module using the documentation by following these instructions:
https://www.labkey.org/Documentation/wiki-page.view?name=xarSamples

I got stuck here:

  • Download either XarTutorial.zip or XarTutorial.tar.gz and extract to your computer.
  • Select the Pipeline tab, and click Setup.
  • Click Set a pipeline override.
  • Enter the path to the directory where you extracted the files.
  • Click Save.

At the step “Select the Pipeline tab, and click Setup.”, I couldn't find “Setup” or “Set a pipeline override” anywhere.
Should this feature work in the Trial version? In the CE version?

My trial server:
https://oa1.trial.labkey.host/home/project-begin.view?

Tried the above on the "Example Project" as well as a newly created project named "proj4experiment".

Thanks,
Ophir

view message
Custom Postgres Method in Labkey Query
(2 responses) sleisle 2021-03-17 16:08

Hello Labkey,

I'm attempting to use a custom postgres method to join two tables in a file-based module query, but when running the query, it errors out stating "Method not found". I saw a similar forum post from 2013 that talked about this being unsupported, but seeing how old the post was I figured I should check if it has been updated. Is there any way to call a custom method, or is it unsupported?

Thanks!

view message
Editor permission without deleting permission.
(3 responses) Chichero 2021-03-15 11:22

Dear LabKey team,

Is there a way to allow a user to insert and update data in a dataset and a sample set without allowing for deleting this data?

Could you propose a workaround if it is not possible?

Thank you,
Natalia

view message
Upgrade Labkey
(1 response) hilariagrieve 2021-03-04 03:13

Hi, I have installed version 17.2 of LabKey Server Community Edition. I designed and implemented a LIMS for the Forensic Genetics Laboratory. I want to upgrade to version 20.11, but I read that it is not recommended to upgrade from 17.2 to 20.3 directly. Can you send me the older versions so that I can upgrade LabKey step by step? I tried to install a new instance of LabKey and import the study folder, but the assay data is not imported correctly. Thank you very much. Hilaria Grieve

view message
Module Upgrade : Error: Unable to initialize search index.
(1 response) hilariagrieve 2021-03-03 06:17

The ALL SITE ERRORS view returns this log. Can you help me?

ERROR LuceneSearchServiceImpl 2021-03-03 09:24:27,258 Module Upgrade : Error: Unable to initialize search index. Search will be disabled and new documents will not be indexed for searching until this is corrected and the server is restarted.
java.nio.file.NoSuchFileException: C:\Program Files\Apache Software Foundation\Tomcat 8.5\temp\labkey_full_text_index_4zf_Lucene50_0.tip
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:79)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
at sun.nio.fs.WindowsFileSystemProvider.newFileChannel(WindowsFileSystemProvider.java:115)
at java.nio.channels.FileChannel.open(FileChannel.java:287)
at java.nio.channels.FileChannel.open(FileChannel.java:335)
at org.apache.lucene.store.MMapDirectory.openInput(MMapDirectory.java:238)
at org.apache.lucene.codecs.blocktree.BlockTreeTermsReader.<init>(BlockTreeTermsReader.java:176)
at org.apache.lucene.codecs.lucene50.Lucene50PostingsFormat.fieldsProducer(Lucene50PostingsFormat.java:445)
at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsReader.<init>(PerFieldPostingsFormat.java:292)
at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat.fieldsProducer(PerFieldPostingsFormat.java:372)
at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:109)
at org.apache.lucene.index.SegmentReader.<init>(SegmentReader.java:74)
at org.apache.lucene.index.ReadersAndUpdates.getReader(ReadersAndUpdates.java:145)
at org.apache.lucene.index.ReadersAndUpdates.getReadOnlyClone(ReadersAndUpdates.java:197)
at org.apache.lucene.index.StandardDirectoryReader.open(StandardDirectoryReader.java:103)
at org.apache.lucene.index.IndexWriter.getReader(IndexWriter.java:473)
at org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:103)
at org.apache.lucene.search.SearcherManager.<init>(SearcherManager.java:108)
at org.apache.lucene.search.SearcherManager.<init>(SearcherManager.java:76)
at org.labkey.search.model.WritableIndexManagerImpl.<init>(WritableIndexManagerImpl.java:106)
at org.labkey.search.model.WritableIndexManagerImpl.get(WritableIndexManagerImpl.java:86)
at org.labkey.search.model.LuceneSearchServiceImpl.initializeIndex(LuceneSearchServiceImpl.java:232)
at org.labkey.search.model.LuceneSearchServiceImpl.start(LuceneSearchServiceImpl.java:327)
at org.labkey.search.SearchModule.startBackgroundThreads(SearchModule.java:190)
at org.labkey.api.module.ModuleLoader.attemptStartBackgroundThreads(ModuleLoader.java:1309)
at org.labkey.api.module.ModuleLoader.initiateModuleStartup(ModuleLoader.java:1286)
at org.labkey.api.module.ModuleLoader.afterUpgrade(ModuleLoader.java:1451)
at org.labkey.api.module.ModuleLoader.lambda$startNonCoreUpgradeAndStartup$3(ModuleLoader.java:1433)
at java.lang.Thread.run(Thread.java:748)
ERROR StudyReload 2021-03-03 09:24:27,508 QuartzScheduler_Worker-1 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 09:29:27,418 QuartzScheduler_Worker-2 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 09:34:27,422 QuartzScheduler_Worker-3 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 09:39:27,426 QuartzScheduler_Worker-5 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 09:44:27,415 QuartzScheduler_Worker-6 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 09:49:27,419 QuartzScheduler_Worker-7 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 09:54:27,423 QuartzScheduler_Worker-9 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 09:59:27,427 QuartzScheduler_Worker-10 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:04:27,416 QuartzScheduler_Worker-1 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:09:27,420 QuartzScheduler_Worker-3 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:14:27,424 QuartzScheduler_Worker-4 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:19:27,428 QuartzScheduler_Worker-5 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:24:27,417 QuartzScheduler_Worker-7 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:29:27,421 QuartzScheduler_Worker-8 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:34:27,425 QuartzScheduler_Worker-9 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:39:27,429 QuartzScheduler_Worker-1 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
ERROR StudyReload 2021-03-03 10:44:27,418 QuartzScheduler_Worker-2 : Study reload failed in folder /LIMS SGFER
org.labkey.api.admin.ImportException: Could not find file studyload.txt in the pipeline root for Lims SGFER
at org.labkey.study.importer.StudyReload$ReloadTask.attemptReload(StudyReload.java:398)
at org.labkey.study.importer.StudyReload$ReloadTask.execute(StudyReload.java:284)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)

view message
Removing server/modules/build.gradle file
(1 response) Susan Hert 2021-02-05 11:44

In order to do away with some flawed logic related to applying the LabKey Gradle plugins, I am planning to remove the server/modules/build.gradle file early next week in our develop branches, in time for LabKey Server version 21.3.0. If you are maintaining your own local modules, you may need to add or update your build.gradle files in order to apply the plugin that this file had been applying. See this page for more details.

After this file is removed, and we update to the 1.25.0 version of the LabKey Gradle plugins, if the Gradle plugin detects a module that has a module.properties file but does not have a module task, a warning like the following will be shown at the end of running the deployApp command:

The following projects have a 'module.properties' file but no 'module' task. These modules will not be included in the deployed server. You should apply either the 'org.labkey.build.fileModule' or 'org.labkey.build.module' plugin in each project's 'build.gradle' file.
        :server:modules:myModule
        :server:modules:otherModule

You can fix this by applying the appropriate plugin as suggested or by defining your own module task that creates the .module file needed for inclusion in your LabKey Server instance.

Also, if you previously had a build.gradle file that references configurations, tasks, etc. defined in the Gradle plugins, you may encounter errors during the configuration phase when running a Gradle task like the following.

FAILURE: Build failed with an exception.

* Where:
Build file '/path/to/Development/labkey/root/server/modules/myModule/build.gradle' line: 6

* What went wrong:
A problem occurred evaluating project ':server:modules:myModule'.
> Could not find method implementation() for arguments com.sun.mail:jakarta.mail:1.6.5 on object of type org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler.

This is fixed in the same way: by applying the plugin that had been applied in server/modules/build.gradle.

If you have questions or encounter problems, please let us know.

Susan

view message
Rlabkey webdav access failing with 401 unauthorized
(3 responses) Marcel de Leeuw 2021-01-31 06:31

Hello,

We are unable to access files on the LabKey server through Rlabkey. selectRows in a folder succeeds, but then accessing files through WebDAV in the same folder, for the same user (API key), fails with error code 401 (unauthorized). Also, WebDAV through the browser works fine. What needs to be configured for WebDAV through Rlabkey?

baseurl <- "http://bifx.mrmhealth.com:8080/labkey"
folderpath <- "/MRM Health/M002/Pre-clinical/E-00065 Rat"
setwd("~/labkey/M002/Pre-clinical")

library(Rlabkey)

labkey.mgx <- labkey.selectRows(
  baseUrl = baseurl,
  folderPath=folderpath, 
  schemaName="assay.General.Whole metagenome", 
  queryName="Runs", 
  viewName="", 
  colNameOpt="rname")
datafile <- gsub(".+@files/(.+)", "\\1", labkey.mgx$realtiveabundances)

labkey.webdav.get(
  baseUrl=baseurl,
  folderPath=folderpath,
  remoteFilePath=datafile,
  localFilePath="rat.mgx.tsv",
  overwrite=T)

Error in handleError(response, haltOnError) :
HTTP request was unsuccessful. Status code = 401, Error message = /_webdav/MRM Health/M002/Pre-clinical/E-00065 Rat/@files/assaydata/ca-mrm-bacteria-filtered-ra_2020_12_04_20_04.tsv
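
One thing I have not tried yet is registering the API key globally before the WebDAV call, in case the credentials used for selectRows are not being reused; a sketch (the key value is a placeholder):

# Register credentials once so the query API and the WebDAV helpers share them.
labkey.setDefaults(apiKey="abcdef0123456789abcdef0123456789",
                   baseUrl=baseurl)

labkey.webdav.get(
  baseUrl=baseurl,
  folderPath=folderpath,
  remoteFilePath=datafile,
  localFilePath="rat.mgx.tsv",
  overwrite=TRUE)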

Thanks and regards

view message
Troubleshooting Reloading Study in a new server with a new version of LabKey Server
(2 responses) hilariagrieve 2021-01-26 08:03

Hi!
I am having problems reloading the study into a new version of LabKey Server on another server.
I have LabKey v17 installed on one server with a study project's data.
Now we have a new server, and I installed the new version of LabKey, v20.7. I tried to create the same study project by exporting it from the old server and importing that export on the new server.
Everything was fine until the pipeline got an error.

This is the log file:
26 ene. 2021 11:55:33,943 INFO : Starting to run task 'org.labkey.study.importer.StudyImportInitialTask' at location 'webserver'
26 ene. 2021 11:55:36,082 INFO : Reloading study from LIMS SGFER_2021-01-26_10-44-53.folder.zip
26 ene. 2021 11:55:36,222 INFO : Loading top-level study properties (label, start/end date, description, etc.)
26 ene. 2021 11:55:40,028 INFO : Done importing top-level study properties (label, start/end date, description, etc.)
26 ene. 2021 11:55:49,286 INFO : Loading QC states
26 ene. 2021 11:55:50,341 INFO : Done importing QC states
26 ene. 2021 11:55:50,349 INFO : Loading treatment data tables
26 ene. 2021 11:55:51,090 INFO : Done importing treatment data tables
26 ene. 2021 11:55:51,138 INFO : Loading assay schedule tables
26 ene. 2021 11:55:51,620 INFO : Done importing assay schedule tables
26 ene. 2021 11:55:51,780 INFO : Loading datasets manifest from datasets_manifest.xml
26 ene. 2021 11:55:51,900 INFO : Loading dataset schema from datasets_metadata.xml
26 ene. 2021 11:55:56,663 INFO : Successfully completed task 'org.labkey.study.importer.StudyImportInitialTask'
26 ene. 2021 11:55:56,910 INFO : Starting to run task 'org.labkey.study.pipeline.StudyImportDatasetTask' at location 'webserver'
26 ene. 2021 11:55:57,055 INFO : Start batch Lims SGFER.dataset
26 ene. 2021 11:55:57,079 INFO : Causas: Starting delete
26 ene. 2021 11:55:57,098 INFO : Causas: Deleted 5075 rows
26 ene. 2021 11:55:57,130 INFO : Causas: Starting import from dataset5001,tsv
26 ene. 2021 11:56:14,049 DEBUG: commit complete
26 ene. 2021 11:56:14,084 INFO : Causas: Successfully imported 5091 rows from dataset5001,tsv
26 ene. 2021 11:56:14,126 INFO : Secuenciacion ABI: Starting delete
26 ene. 2021 11:56:14,128 INFO : Secuenciacion ABI: Deleted 0 rows
26 ene. 2021 11:56:14,147 INFO : Secuenciacion ABI: Starting import from dataset5003,tsv
26 ene. 2021 11:56:15,202 ERROR: dataset5003,tsv -- Two columns mapped to target column: CausasId
26 ene. 2021 11:56:15,339 INFO : Extracción de ADN: Starting delete
26 ene. 2021 11:56:15,341 INFO : Extracción de ADN: Deleted 0 rows
26 ene. 2021 11:56:15,344 INFO : Extracción de ADN: Starting import from dataset5004,tsv
26 ene. 2021 11:56:15,716 ERROR: dataset5004,tsv -- Two columns mapped to target column: CausasId
26 ene. 2021 11:56:15,737 INFO : Cuantificación Fluorimétrica: Starting delete
26 ene. 2021 11:56:15,739 INFO : Cuantificación Fluorimétrica: Deleted 0 rows
26 ene. 2021 11:56:15,741 INFO : Cuantificación Fluorimétrica: Starting import from dataset5005,tsv
26 ene. 2021 11:56:15,899 ERROR: dataset5005,tsv -- Two columns mapped to target column: CausasId
26 ene. 2021 11:56:15,921 INFO : Cuantificación Real Time: Starting delete
26 ene. 2021 11:56:15,923 INFO : Cuantificación Real Time: Deleted 0 rows
26 ene. 2021 11:56:15,926 INFO : Cuantificación Real Time: Starting import from dataset5006,tsv
26 ene. 2021 11:56:16,017 ERROR: dataset5006,tsv -- Two columns mapped to target column: CausasId
26 ene. 2021 11:56:16,039 INFO : Amplificación por PCR: Starting delete
26 ene. 2021 11:56:16,041 INFO : Amplificación por PCR: Deleted 0 rows
26 ene. 2021 11:56:16,044 INFO : Amplificación por PCR: Starting import from dataset5007,tsv
26 ene. 2021 11:56:16,538 ERROR: dataset5007,tsv -- Two columns mapped to target column: CausasId
26 ene. 2021 11:56:16,545 INFO : Finish batch Lims SGFER.dataset
26 ene. 2021 11:56:16,549 INFO : Updating participant visits
26 ene. 2021 11:56:16,566 INFO : Updating participants
26 ene. 2021 11:56:16,749 INFO : Updating participant visit table
26 ene. 2021 11:56:16,750 INFO : Updating visit table
26 ene. 2021 11:56:16,752 INFO : Updating cohorts
26 ene. 2021 11:56:16,754 INFO : Clearing participant visit caches
26 ene. 2021 11:56:16,757 INFO : Finished updating participants
26 ene. 2021 11:56:16,807 INFO : Successfully completed task 'org.labkey.study.pipeline.StudyImportDatasetTask'

I attached an example dataset file that got that error; it has two column names that map to the target column CausasId: "CaudasId" and "ParticipantId".

Please, can you help me resolve this problem?
Everything is reloaded in the new study except the dataset data.
Thank you very much!
Hilaria

view message
GUI not fully loading
mitch blocher 2021-01-25 13:37

I've noticed that occasionally the GUI will not display certain information, such as scripts, pop-up boxes, etc. I made sure pop-ups and JavaScript were enabled in my Chrome browser, and cookies and cache were deleted. All LabKey servers worked fine except for one. The solution I found was to reset all settings in my Chrome browser; however, I never found the specific culprit causing the issue.

view message
Scheduled Maintenance: artifactory.labkey.com Friday January 22, 2021 @ 7 AM PDT
(1 response) Jon (LabKey DevOps) 2021-01-20 13:01
We will be performing scheduled maintenance on LabKey's artifactory server ( https://artifactory.labkey.com/ ) on Friday January 22, 2021 @ 7 AM PDT.

The expected downtime should be approximately 60-90 minutes.

This will affect build artifacts during this maintenance period.

While Artifactory is unavailable, you should use the --offline flag for gradle when building.

We are performing this maintenance to install the latest security patches for Artifactory and its underlying operating system.
view message
migration from standard server to docker and authentication
(1 response) ramez saour 2021-01-18 04:06

I have migrated my LabKey and Tomcat 7 configuration from a standard host to a Docker image with Tomcat 8 and the same LabKey configuration; however, since moving to Docker I can't authenticate to the LabKey portal. Is there a way to use a local account (admin) or configure an LDAP connection inside Docker?

I also have this configured:
<Resource name="ldap/ConfigFactory" auth="Container"
type="org.labkey.premium.ldap.LdapConnectionConfigFactory"
factory="org.labkey.premium.ldap.LdapConnectionConfigFactory"
host="10.5.0.9"
port="389"
principal="CN=zSvcADBindRDLK,OU=ServiceAccounts,OU=GEL,DC=corp,DC=gel,DC=ac"
credentials="***********"
/>

view message
XML counter columns
(1 response) katy wiseman 2021-01-14 01:43

Hi,

I'm trying to get an XML counter column to work as described here at the bottom of the page:
https://www.labkey.org/Documentation/Archive/20.7/wiki-page.view?name=sampleIDs

However, it's not working. I've created the sample type as shown in the example and pasted the XML code into the XML metadata, but it's not generating an automatic 'SampleInLot' number for each new 'Lot' in the table.

This is the XML metadata suggested:

<tables xmlns="http://labkey.org/data/xml">
  <table tableName="MySampType" tableDbType="NOT_IN_DB">
    <javaCustomizer class="org.labkey.experiment.api.CountOfUniqueValueTableCustomizer">
      <properties>
        <property name="counterName">
          SampleCounter
        </property>
        <property name="counterType">
          org.labkey.api.data.UniqueValueCounterDefinition
        </property>
        <!-- one or more pairedColumns used to derive the unique value -->
        <property name="pairedColumn">
          lot
        </property>
        <!-- one or more attachedColumns where the incrementing counter value is placed -->
        <property name="attachedColumn">
          sampleInLot
        </property>
      </properties>
    </javaCustomizer>
  </table>
</tables>

Does anyone have any thoughts about how to get this to work? Or any other alternatives?

Thanks,

Katy

view message
Security Vulnerability Notice for LabKey users on PostgreSQL - PgMiner botnet
Jon (LabKey DevOps) 2020-12-15 18:09

Due to the PgMiner botnet exploit/vulnerability (https://www.zdnet.com/article/pgminer-botnet-attacks-weakly-secured-postgresql-databases), we recommend that any LabKey user with a self-hosted environment read the article and take the following precautions:

  • If you still have the default postgres user that came with your PostgreSQL installation, please do the following:
  1. Create a replacement superuser on your database.
  2. After creating the replacement user, delete the default postgres user.
  • Confirm that your PostgreSQL database can only be accessed by your LabKey server on your specific designated port. By default, PostgreSQL uses port 5432. Consider either changing to a different port or updating your security settings (e.g. firewall, pg_hba.conf file) so that only your LabKey instance can access your PostgreSQL server.

For more information on securing your PostgreSQL server, please check out the following resource:

https://www.enterprisedb.com/blog/how-to-secure-postgresql-security-hardening-best-practices-checklist-tips-encryption-authentication-vulnerabilities

view message
Error: Search is disabled because the search index is misconfigured. Contact the system administrator of this server.
(1 response) hilariagrieve 2020-12-04 05:05

Hi!
I have this problem. How can I fix it? Thanks!

org.apache.lucene.store.LockObtainFailedException: Lock held by this virtual machine: C:\Program Files\Apache Software Foundation\Tomcat 8.5\temp\labkey_full_text_index\write.lock
at org.apache.lucene.store.NativeFSLockFactory.obtainFSLock(NativeFSLockFactory.java:127)
at org.apache.lucene.store.FSLockFactory.obtainLock(FSLockFactory.java:41)
at org.apache.lucene.store.BaseDirectory.obtainLock(BaseDirectory.java:45)
at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:804)
at org.labkey.search.model.WritableIndexManagerImpl.get(WritableIndexManagerImpl.java:78)
at org.labkey.search.model.LuceneSearchServiceImpl.initializeIndex(LuceneSearchServiceImpl.java:232)
at org.labkey.search.model.LuceneSearchServiceImpl.resetIndex(LuceneSearchServiceImpl.java:343)
at org.labkey.search.SearchController$AdminAction.handlePost(SearchController.java:319)
at org.labkey.search.SearchController$AdminAction.handlePost(SearchController.java:221)
at org.labkey.api.action.FormViewAction.handleRequest(FormViewAction.java:101)
at org.labkey.api.action.FormViewAction.handleRequest(FormViewAction.java:80)
at org.labkey.api.action.BaseViewAction.handleRequest(BaseViewAction.java:177)
at org.labkey.api.action.SpringActionController.handleRequest(SpringActionController.java:416)
at org.labkey.api.module.DefaultModule.dispatch(DefaultModule.java:1226)
at org.labkey.api.view.ViewServlet._service(ViewServlet.java:205)
at org.labkey.api.view.ViewServlet.service(ViewServlet.java:132)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:742)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.api.data.TransactionFilter.doFilter(TransactionFilter.java:38)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.core.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:118)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.api.module.ModuleLoader.doFilter(ModuleLoader.java:1147)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.api.security.AuthFilter.doFilter(AuthFilter.java:217)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:478)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:80)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:624)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:799)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:868)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1455)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:748)

view message
Apologies for inadvertent notification email "RE: Table view permissions"
chetc (LabKey Support) 2020-12-03 10:17

On 12/2, you were inadvertently sent a message from LabKey Support with the subject "RE: Table view permissions." The default email setting for our support forum was set too broadly; we have corrected the problem. We sincerely apologize for the inconvenience.

view message
LabKey React npm package available - @labkey/components
cnathe 2020-11-17 06:30

Hello All,
For any LabKey module developers out there who are working on creating LabKey pages or applications using React, we would like to let you know that the first public version of the @labkey/components npm package has been released and is available for use within your modules. This package contains some React based components, models, actions, and utility functions that will help with data access and display of LabKey schemas/tables in your React application.

You can get further information about the available components, how to install the package locally, and see some example usages from the Public API doc page.

Thanks,
Cory Nathe

view message
Table view permissions
(3 responses) sjvanro 2020-11-12 11:41

When impersonating a reader for my tables, I'm able to see data within the table. However, users within a group under reader permissions are not able to see data in the tables. How do I resolve this?

view message
wiki bug
(1 response) Stefan Nicolet 2020-11-10 03:59

A long JSON message (attached) arises when people edit (add, remove, or modify) elements of a list that has been integrated into a wiki page in the following way:

<h4>People in the network</h4><br>
<div id="list_div"></div>
<script>
var wp = new LABKEY.WebPart({ renderTo: 'list_div', partName: 'List - Single', partConfig: { listName: 'medict_network'}, frame: 'none'});
wp.render();
</script>

This bug does not happen if the edit is done on the list itself.

 displayed_message.txt 
view message
Some code issues found by Static Code Analyzer that may need reviewing
(1 response) stevec 2020-11-09 10:17

I ran a static code analyzer on the LabKey code and found the following issues that may need reviewing:

========================================================================================

  1. The String equals method is called with an org.labkey.api.util.Path argument, not a String. The equals method will always return false.

Two examples; please note that the getPath() method returns an org.labkey.api.util.Path, not a String.

Example 1

org.labkey.api.webdav.AbstractWebdavResource
    ...
    public boolean canRead(User user, boolean forRead)
    {
        if ("/".equals(getPath()))

=====================================================================================
Example 2

org.labkey.api.webdav.WebFilesResolverImpl
    ...
    public boolean canRead(User user, boolean forRead)
    {
        if ("/".equals(getPath()))

========================================================================================

  2. The org.labkey.api.util.Path equals method is called with a String argument; org.labkey.api.util.Path.equals will return false when the input Object is not an org.labkey.api.util.Path.

File: ./server/modules/platform/core/src/org/labkey/core/webdav/DavController.java

             public void writeProperties(WebdavResource resource, Find type, List<String> propertiesVector)

                        ......
                        ......
                        .......

                  String displayName = resource.getPath().equals("/") ? "/" : resource.getName();

========================================================================================


                       public void writeProperties(WebdavResource resource, Find type, List<String> propertiesVector) throws Exception
                        {
                        json.object();
                        json.key("id").value(resource.getPath());
                        String displayName = resource.getPath().equals("/") ? "/" : resource.getName();
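
To make findings 1 and 2 concrete, here is a minimal, self-contained illustration using a stand-in class rather than the real org.labkey.api.util.Path; in both directions the comparison mixes a String with a non-String object, so equals() can never return true:

    public class MixedEqualsDemo
    {
        // Stand-in for a Path-like type whose textual form is "/" but which is not a String.
        static final class FakePath
        {
            @Override
            public String toString() { return "/"; }

            @Override
            public boolean equals(Object o) { return o instanceof FakePath; }

            @Override
            public int hashCode() { return 1; }
        }

        public static void main(String[] args)
        {
            FakePath path = new FakePath();

            System.out.println("/".equals(path));             // finding 1: always false, the argument is not a String
            System.out.println(path.equals("/"));             // finding 2: always false, the argument is not a FakePath
            System.out.println("/".equals(path.toString()));  // true once both sides are Strings
        }
    }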


========================================================================================

  3. The HashMap.get(Object) call passes a ColumnInfo object, which is not the key type defined for the HashMap. The HashMap key is a String.

     Map<String, Integer> outputMap = DataIteratorUtil.createColumnAndPropertyMap(it);       // String is the key to this HashMap

        ColumnInfo containerColumn = table.getColumn("container");
 

        Integer indexContainer = outputMap.get(containerColumn);       // containerColumn is not a String
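
To make this concrete, here is a minimal, self-contained illustration using plain JDK types and a stand-in for ColumnInfo (not the LabKey classes): a map keyed by String silently returns null when queried with an object of a different type, so the lookup presumably needs to use the column's name instead:

    import java.util.HashMap;
    import java.util.Map;

    public class MapKeyTypeDemo
    {
        // Stand-in for ColumnInfo: carries a name, but is not itself a String.
        static final class FakeColumn
        {
            private final String name;
            FakeColumn(String name) { this.name = name; }
            String getName() { return name; }
        }

        public static void main(String[] args)
        {
            Map<String, Integer> outputMap = new HashMap<>();
            outputMap.put("container", 3);

            FakeColumn containerColumn = new FakeColumn("container");

            System.out.println(outputMap.get(containerColumn));            // null: the key type is String
            System.out.println(outputMap.get(containerColumn.getName()));  // 3: look up by the String name
        }
    }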

=====================================================================================

4. NullPointerExceptions

The variable raf will cause a NullPointerException.

class: org.labkey.core.webdav.DavController$PutAction

WebdavStatus doMethod() throws DavException, IOException, RedirectException {

....
.....
....

           RandomAccessFile raf = null;
            OutputStream os = null;

            try
            {
                if (!exists)
                {
                    temp = getTemporary();
                   if (temp)
                        markTempFile(resource);
                    deleteFileOnFail = true;
                }

                File file = resource.getFile();
                boolean isBrowserDev = getUser().isTrustedBrowserDev();
                if (range != null)
                {
                    if (resource.getContentType().startsWith("text/html") && !isBrowserDev)
                        throw new DavException(WebdavStatus.SC_FORBIDDEN, "Partial writing of html files is not allowed");
                    if (null != AntiVirusService.get())
                        throw new DavException(WebdavStatus.SC_FORBIDDEN, "Partial writing not supported with virus scanner enabled");
                    if (range.start > raf.length() || (range.end - range.start) > Integer.MAX_VALUE)

=====================================================================================

The variable context will cause a NullPointerException.

org.labkey.api.reports.report.ScriptReportDescriptor

  public ReportDescriptorDocument getDescriptorDocument(ImportContext context)
    {
        // if we are doing folder export (or module file save), we don't want to double save the script property
        if (null != context)
            return getDescriptorDocument(context.getContainer(), context, true, Set.of(Prop.script.name()));
        else
            return getDescriptorDocument(context.getContainer(), context, true, Collections.emptySet());
    }
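
A minimal, self-contained illustration of the flagged pattern (not the LabKey classes): the null check protects the first branch, but the else branch still dereferences the possibly-null reference:

    public class NullBranchDemo
    {
        static String describe(Object context)
        {
            if (null != context)
                return "context: " + context.toString();
            else
                return "context: " + context.toString();   // NullPointerException when context is null
        }

        public static void main(String[] args)
        {
            System.out.println(describe("ok"));    // prints normally
            System.out.println(describe(null));    // throws NullPointerException
        }
    }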

=====================================================================================

  5. To honor the if statement, remove the semicolon at:

if (BACKGROUND_WELL_GROUP.equals(group.getName()));

class: org.labkey.elispot.ElispotPlateTypeHandler

   {
        Map<String, Map<String, Double>> backgroundMap = new HashMap<>();
        WellGroup backgroundGroup = null;

        for (WellGroup group : plate.getWellGroups(WellGroup.Type.CONTROL))
        {
            if (BACKGROUND_WELL_GROUP.equals(group.getName()));
            {
                backgroundGroup = group;
                break;
            }
        }
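
A minimal, self-contained illustration of what the stray semicolon does (not the LabKey classes): the semicolon is an empty statement that terminates the if, so the block that follows always runs:

    public class EmptyIfDemo
    {
        public static void main(String[] args)
        {
            String groupName = "TREATMENT";

            if ("BACKGROUND".equals(groupName));   // the semicolon ends the if with an empty body
            {
                // this block is not controlled by the if above; it always executes
                System.out.println("treated as the background group: " + groupName);
            }
        }
    }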

view message
automatically defining all dataset fields with a specific number format
(1 response) karenchait841 2020-11-05 01:27

Hi,
In the study module, when I create a new dataset, how can I define all fields of type decimal to have a specific number format? I have a very long list of fields and it is impossible to do it manually for all of them. I tried to define the number format of the study folder but it did not solve this issue.
Thanks,
Karen

view message
Setting expression matrix assay
(1 response) karenchait841 2020-11-05 01:19

Hello, I am trying to set up LabKey as our lab's database. Some of the projects include expression matrix data, but the assay format is not as I expected. If I understand correctly, in the Sample Type the columns are the samples, so it is not an annotation file for samples, and the Assay List is the expression matrix but in a list format and not a matrix. Is there a different way to do it? For now I uploaded the data to a subfolder defined as a study, but the gene annotations can't be integrated with the data.
Thanks for any advice,
Karen

view message
TeamCity Trunk project rename
Trey 2020-10-27 13:41

As announced previously, we have finished migrating our source code from SVN to Git.
To reflect this, we are updating the "Trunk" project name on TeamCity. Henceforth, it will be labeled "Develop" to match our Git branch naming.

Note: In order to maintain backward compatibility with existing bookmarks and scripts, IDs will not be changed at this time.

view message
Source Code migration to Git
ians 2020-10-21 14:41

Hi All,

For 20.11 we have finished migrating our source code to Git. Prior releases will remain in SVN and continue to be maintained based upon our Release and Upgrade policy. Changes and updates to the source code for 20.11 and beyond will only be performed in Git. If you are building from SVN trunk source, please migrate at your earliest convenience.

Migration instructions:

For these instructions, assume the original enlistment is in ./trunk/:

  1. Clone the new repo's develop branch to a sibling directory of your current enlistment
    git clone https://github.com/LabKey/server labkeyServer
  2. Then reenlist or copy over the server/modules directory to the corresponding new location
    cp -r trunk/server/modules labkeyServer/server/
  3. Reenlist or Copy over any additional enlistments (./remoteapi/*, ./server/testAutomation, etc.)
  4. Copy over any other changes -- specifically ./settings.gradle, ./server/configs/pg.properties, ./server/configs/mssql.properties, etc.
    cp trunk/server/configs/pg.properties labkeyServer/server/configs/pg.properties
  5. cd into the new directory
  6. execute ./gradlew cleanNodeModules from the new enlistment's root --- our initial testers ran into some conflicts and errors within these directories
  7. execute ./gradlew pickPg or ./gradlew pickMSSQL
  8. The new enlistment should then build as normal.
  9. (optional) If you are using IntelliJ, you will need to update the various project settings as you would for any new enlistment. Copying over the .idea directory can retain these, but it may also carry over some project-specific settings that don't apply in the new location.

Ian

view message
list table quadruples when upgrading labkey from v17 to v19.1
(4 responses) qing chang 2020-10-06 07:19
We have recently upgraded one of our labkey servers from v17 to v20.7. We had to first upgrade to v19.1 before going to 20.7. We use postgresql 9.6 as backend.

We noticed there was a big jump in storage usage after upgrading to 19.1. A closer look revealed that ALL list-type tables have quadrupled in storage usage without any changes in contents. One of the tables was 10 GB; it became 40 GB after the upgrade.
-------
 list.c185d2069_modc_test | 10 GB
 list.c185d2069_modc_test | 40 GB
-------

The row count is 50 million before and after the upgrade. The first 10 rows are identical. I have no reason to believe there is any difference in the other rows.

Can someone shed some light on this?

Thanks in advance.

Qing Chang
view message
Questions about the study module
(1 response) eva pujadas 2020-09-25 09:06

Dear LabKey supporters,

We are using LabKey Server 19.3.7.

  1. Is it possible to have the ParticipantId field be of type integer? By default it is text, and it does not sort properly when the IDs are numbers (see attached screenshot "participant_id_sorting.png").

  2. When inserting a new record from the Participant view of a particular participant ID (see screenshot "insert_demographic_record_1.png"), shouldn't the form field for the participant ID already be populated with the ID? (see screenshot "insert_demographic_record_2.png")
    This seems to be the expected behavior, also because the URL to the insert form contains the participant ID as a parameter, e.g.: ....dataset-insert.view?datasetId=5019&quf_ParticipantId=455585

Thank you and regards,
Eva

 insert_demographic_record_2.png  insert_demographic_record_1.png  participant_id_sorting.png 
view message
LabKey database not populated on startup
(3 responses) james gregoric 2020-09-11 10:18

2020-09-11 Update: When we go to the Tomcat Web Application Manager page and click on the labkey link the following error appears:
java.lang.NullPointerException at
org.labkey.api.module.ModuleLoader.upgradeCoreModule(ModuleLoader.java:1332) at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:499) at org.labkey.api.module.ModuleLoader.init(ModuleLoader.java:239)

Looking at our PostGres database we see that the LabKey tables have been created, but all tables are empty. So for some reason the tables are not being populated even though we clearly have write access to the database (since the tables were created by the startup process).

As noted in the error dump, the error occurs at line 1332 in ModuleLoader.java. (See ..\server\modules\platform\api\src\org\labkey\api\module). The code there reads:

1325 // If modules table doesn't exist (bootstrap case), then new up a core context
1326 if (getTableInfoModules().getTableType() == DatabaseTableType.NOT_IN_DB)
1327 coreContext = new ModuleContext(coreModule);
1328 else
1329 coreContext = getModuleContext("Core");
1330
1331 // Does the core module need to be upgraded?
1332 if (!coreContext.needsUpgrade(coreModule.getSchemaVersion()))
1333 return false;

Evidently, coreContext is null when line 1332 executes; since the constructor call on line 1327 cannot return null, the getModuleContext("Core") call on line 1329 is presumably returning null. We assume that is because the database is not populated.

view message
Lookups/Links to primary key/normal values in SQL queries
(5 responses) max diesner 2020-08-24 02:36

Hi,

we have had a community edition LabKey server running in the lab for some time now and it is just awesome. I have recently set up some SQL queries for keeping track of our lab inventory, which is also working quite nicely. However, there is a minor thing that keeps bugging me. The resulting query table does not automatically copy the lookup/text link for a certain value from the underlying table (list). I checked the documentation and it states that you can easily get this done by applying the lookup through the metadata properties. Thus, I accordingly changed the column to a lookup from the appropriate underlying column. I also checked the metadata code and can see that the appropriate <fk></fk> part has been added. Nevertheless, the resulting table still does not show the appropriate text links. Do I need to check some option that I am missing, or does the LEFT JOIN statement in the SQL code override any potential lookups for certain values? For your information, I am trying to connect primary keys. When I reference a table where the value is already a lookup, it copies the text link right away.

I am looking forward to hearing from you!

Cheers Max

view message
Labkey/Panorama server not recognized in Skyline
(3 responses) mlane 2020-08-13 14:42

Hi,

We just set up a Community edition LabKey server to test Panorama with Skyline. I was able to access it directly through a browser, log in, and create project folders and subfolders set up as Panorama.

However, Skyline doesn't connect to the server and gives an error saying that our server URL is not a Panorama server. I'm wondering if it's a site setting or a permissions issue? See the attachment for the error message.

Thanks,

Monica Lane

 Labkey_error_08132020.pdf 
view message
Date error msg: "couldn’t convert date field, should be of type timestamp"
(6 responses) panthea tzourio 2020-08-10 03:46

Hello support, our LabKey server is situated in France in the CET time zone (Central European Time). We encounter a persistent CEST (Central European Summer Time) exception on date fields each time we try to import datasets into a Study type folder (time and participant are mandatory fields). Even the simple, structured Excel files David Hansons uses for his demos produce the same error: "couldn't convert date field, should be of type timestamp", while the field is already of type timestamp. The result is the same on both our v18.3 and v20.3 (pre-prod) LabKey installs. Please find attached the Look and Feel setting with the default date format, to which I added a "z" to force the system date; the CEST error message; and an example of an Excel file generating the exception. Only the "Demographics" file doesn't generate the error, even though the date is in the same format. I assume there's no such control on the date field in demographics files. I deleted the default CEST date format in the study in case this caused the error, but the result is the same. This very first error prevents us from going further with our LabKey investigation tests.
Thanks in advance for your time.

 Look and feel.PNG  Error msg CEST.PNG  LabResults.xls 
view message
Survey Design - Card Layout - Section skip logic
(1 response) rita alves 2020-08-07 07:39

Hello,

I am doing a survey design with a card layout. Is it possible to hide a section based on an answer to a question in a different section?

Thank you in advance!

view message
R dataframe from external data source to data grid
(1 response) dhutchison 2020-08-06 18:21

Hi,

I have an R script that queries an external data source (through JDBC, not labkey.xml) and would like to display the resulting dataframe as a data grid in LabKey. I want the data to be dynamically loaded, that is, the query runs each time the report is viewed. I'm thinking this should be straightforward, but I'm not quite seeing how to do it.

For example:

result<- dbGetQuery(con, sql)
# ${htmlout:csvfile}
write(unlist(result), file="csvfile")

will output the result text in the report tab, but not as a grid.

If I don't "unlist", I get an error:

Error in cat(x, file = file, sep = c(rep.int(sep, ncolumns - 1), "\n"), :
argument 1 (type 'list') cannot be handled by 'cat'"

because write expects an atomic vector, I guess.

Any thoughts on how to wrangle the dataframe into a Data Grid?

Thanks

view message
Mascot Server issue
david lee 2020-07-10 11:45

Hi.
A new user of Labkey here. We have just set up our Dev instance.
We are trying to connect to our internal Mascot server to run MS2 pipelines.
On the Admin config page we test the connection and everything works... the report says "Test Passed".
I go to a project and try to set up a search... but get this error:
There was a problem retrieving the database list from the server: Failed to interact with the Mascot server.

Any help would be welcome.

Thanks

David Lee

view message
Trigger process after successful specimen import
(4 responses) tstellin 2020-06-29 12:23

Hi,

We'd like to kick off a post-specimen import process that starts when labkey finishes a specimen import pipeline job. Does a feature like this exist in Labkey Server? I've looked through the UI and documentation, and didn't see anything, but wanted to double-check and make sure it doesn't exist.

Thanks!
-Tobin

view message
Generating a folder access report
(1 response) jgane 2020-06-25 12:28

Hi,

I was wondering if there is a way to generate a list of all users that have access to a folder for every folder/subfolder on LabKey? Essentially we want to collect the /folderAccess.view page for every folder into something like a JSON.

We tried using the R API to pull a list of all the folders on LabKey and then querying the core.Users table for each folder, however we ran into an issue where if a user is a part of a project group, they will show up in the core.Users table for every subfolder regardless of whether they actually have access to that subfolder or not. Even if that project group is not added to any folders and has no ability to access anything in that project, the users in that group will still show up in the core.Users table for every subfolder.

In case it helps we are running on version 20.3.

Any suggestions or ideas would be appreciated.

Thanks,
Jon

view message
Enabling CORS and file deletion issues
(7 responses) inaki martinez 2020-06-24 08:35

Dear Labkey community

we are having some issues while enabling CORS in LabKey for some external collaborators. Our LabKey instances work fine without CORS, but as soon as we enable it, some LabKey functionality stops working. As far as we could test, only file deletion stops working, but it is still a problem.

Our labkey server has the following versions:

  • apache-tomcat: 8.5.53
  • openjdk:13.0.2
  • LabKey19.3.10-65330.20

The CORS configuration is as follows, although I've tried using just the default values without any luck.

     <filter>
        <filter-name>CorsFilter</filter-name>
        <filter-class>org.apache.catalina.filters.CorsFilter</filter-class>

        <init-param>
          <param-name>cors.allowed.origins</param-name>
          <param-value>https://bc2-labkey-dev.bc2.unibas.ch, https://bc2-labkey-dev.bc2.unibas.ch, https://labkey-dev.scicore.unibas.ch, https://labkey-dev.scicore.unibas.ch, https://wiki.biozentrum.unibas.ch, https://labkey.scicore.unibas.ch</param-value>
        </init-param>

        <init-param>
          <param-name>cors.allowed.methods</param-name>
          <param-value>GET,POST,OPTIONS</param-value>
        </init-param>

        <init-param>
           <param-name>cors.allowed.headers</param-name>
           <param-value>Access-Control-Expose-Headers,Access-Control-Allow-Origin,X-Requested-With,Content-type,Authorization</param-value>
        </init-param>

        <init-param>
          <param-name>cors.exposed.headers</param-name>
          <param-value>Access-Control-Expose-Headers,Access-Control-Allow-Origin,X-Requested-With,Content-type,Authorization</param-value>
        </init-param>

        <init-param>
          <param-name>cors.support.credentials</param-name>
          <param-value>true</param-value>
        </init-param>

        <init-param>
          <param-name>cors.preflight.maxage</param-name>
          <param-value>1800</param-value>
        </init-param>
      </filter>

      <filter-mapping>
        <filter-name>CorsFilter</filter-name>
        <url-pattern>/*</url-pattern>
      </filter-mapping>

Even with a simpler CORS setup it does not work:

     <filter>
        <filter-name>CorsFilter</filter-name>
        <filter-class>org.apache.catalina.filters.CorsFilter</filter-class>

        <init-param>
          <param-name>cors.allowed.origins</param-name>
          <param-value>*</param-value>
        </init-param>
      </filter>

      <filter-mapping>
        <filter-name>CorsFilter</filter-name>
        <url-pattern>/*</url-pattern>
      </filter-mapping>

Upon trying to delete a file I have just uploaded in the Files tab, I get a "Failed to delete" message. The Firefox developer tools show:

Request URL:https://bc2-labkey-dev.bc2.unibas.ch/labkey/_webdav/home/%40files/afile_to_delete.txt?method=DELETE&pageId=4267287a-97a1-1038-bf7a-3c9cbf81b968
Request Method:POST
Remote Address:0.0.0.0:443
Status Code:
403
Version:HTTP/1.1
Referrer Policy:origin-when-cross-origin

The logs don't show any error during this request.

Is there anything else we should modify, or add for this to work? I'm currently out of ideas.

Best regards,
Iñaki

view message
How to track deletion of a row in a dataset?
(5 responses) Chichero 2020-06-19 04:07

I have not found any log information that states explicitly that a particular user deleted an entry in a dataset. In our study I looked in Audit Log - Assay/Experiment Events, but only found info about who created an entry, "Derive sample from sampleid-X", and a comment with "Run deleted".

Could you help me to find a log with precise information about who deleted, who created and who modified an entry in a dataset and/or a sample set?

view message
List with timestamps unsorted
(3 responses) eva pujadas 2020-06-18 09:23

Dear Labkey team,

We are having a strange issue when creating a new list imported from a CSV file. The source CSV file is sorted by one of the columns representing the date and time, but in the created list this sorting is not preserved for some of the rows after import. I attach a screenshot to visualize what I mean.

I also attach an example CSV file (with about 85'000 lines). The file contains some "?" instead of numbers, which we are marking as "missing value indicators".

Do you know about some similar issue and what could cause it?

Thanks a lot,
Eva

 Test.csv  forums.png 
view message
File Upload Issue
(3 responses) jgane 2020-05-26 07:54

Hi,

We have a user who has been running into an issue trying to upload a specific HTML file to a files webpart on LabKey version 18.2. Every time she does, she gets an Unauthorized error stating "You do not have privileges to upload.", and when checking the LabKey Primary Site Log File I see the error message below:

WARN FileSystemResource 2020-05-15 15:48:20,754 catalina-exec-16 : user@institution.org attempted to delete file that is not writable by LabKey Server. This may be a configuration problem. file: /mnt/labkey/Release_Prep/Sandbox_and_Resources/SB_CLIN/@files/Data_Quality_Report_VS.html

If I, as an admin, try to upload the file I am able to do so successfully; however, if I impersonate her account and try, I get the same permissions error. She is able to upload other files to this folder just fine and, strangely enough, she is able to upload the file to an issue in an issue tracker, but just not to a files webpart. Additionally, no other users are reporting issues with uploading files.

Any help or insight on this would be appreciated.

Jon

view message
Error on some wiki pages when trying to save new content with varying users
(4 responses) max diesner 2020-05-25 07:38
Hi,

I am a fairly new user to the LabKey environment. I have set up a community edition version 20.3 (updated from 19.3) and everything works when I am editing and uploading files through my main admin account. However, when my colleagues want to edit certain wiki pages we get the following error:

Unable to save changes due to the following validation errors:
Illegal attribute 'onclick' on element <a>.

this happens when we use the "advanced editor"

When we edit the wiki page via the normal editor we just get the message "Error saving wiki".

What is interesting is that this error only pops up on certain wiki pages and not on all of them. When a user is given "Site Administrator" rights, the user can edit the page and save it. However, as soon as these rights are removed and the user only has the normal "Folder Administrator" rights, the problem occurs again. The user can, however, set up new pages without a problem. The problem only appears on some wiki pages that were created by my account, the "Site Administrator" account.

I know that it has something to do with the rights management or with the creation of these particular pages, but I am lacking the knowledge of how to check where the error comes from.

Looking forward to hearing from you guys, you are doing an awesome job!

P.S.: The server is running on a virtual server, Windows Server 2019 with 8 GB RAM. Server version is 20.3 with an updated Java 14.0.1+7, Tomcat 9.0.26.
view message
Checkbox to select all columns from a table in gridview
(1 response) Edward 2020-05-17 16:25

Hi,

When customising a grid, I have to manually check each column from other tables to make a new grid view. I have wide tables with approximately 150 columns in each, and I have to click 150 times if I want to select all columns from any of these tables. Is there a way to customise my grid by clicking a single checkbox to select all the columns from any of the wide tables?

In the attached screenshot, I have highlighted a checkbox for the whole table, but clicking it is disabled. Is there any way to enable this checkbox so a user can select all columns from any table with a single click?

 Screenshot.png 
view message
Import large proteomic assay to labkey
(1 response) johann pellet 2020-05-12 09:48

Hi all,

I am trying to upload a large proteomic NMR dataset (9600 columns x 240 rows) into LabKey 19.3.
Below a subset of the matrix:

Sample_ID    9.99950027      9.99849987      9.99750042
  L-1025-01     3547.56219015   4502.33293817   3747.21499051
  L-1025-02     -918.88494389   1544.06141934   -553.62238202

The input matrix consists of N rows of samples with M columns of bin intensities.

Beforehand, I created the sample set NMR samples (below a subset)

Name Volume  Unit
L-1025-01       130.0   uL
L-1025-02       130.0   uL

When I try to import the assay, I choose the assay type General and, in the Results Fields, I delete all the default fields before I import my matrix into LabKey.

This method does not work because LabKey does not accept importing more than 1600 columns. See below an extract of labkey.log:

ERROR BaseApiAction 2020-05-12 13:02:16,710 ajp-nio-8009-exec-5 : ApiAction exception:
org.springframework.dao.DataAccessResourceFailureException: SqlExecutor.execute(); SQL []; ERROR: tables can have at most 1600 columns; nested exception is org.postgresql.util.PSQLException: ERROR: tables can have at most 1600 columns

So what I did is transpose my matrix like this (240 columns x 6900 rows):

Feature_ID  L-1025-01          L-1025-02
9.99950027      3547.56219      -918.8849439
9.99849987      4502.332938     1544.061419
9.99750042      3747.214991     -553.622382

The assay was created in LabKey after a very long time and an internal server error (I should check the Tomcat and/or Apache configuration).
But now I don't see how I could import this assay into my study, matching the Sample Set created before. Indeed, when I click Copy to Study, Participant IDs and Visit IDs are required for all rows. The problem is that each row is not a Sample_ID but a feature.... See images attached.
How could I manage this kind of array in LabKey? Should I do it with another method?
Thank you for your help.

Regards,
Johann

 Annotation 2020-05-12 184227.png 
view message
Patch in 20.3 community edition for Issue 40227: SQLException when importing a Skyline document with a chromatogram library
(5 responses) tvaisar 2020-05-11 04:36

I recently ran into an issue when uploading a Skyline document with a chromatogram library into a Panorama folder on our LabKey Server. Vagisha pointed me to Issue 40227: SQLException when importing a Skyline document with a chromatogram library, which she recently resolved.
I wonder if there is/will be a patched version of the v20.3 community edition. This issue was only recently fixed (4/17) and closed 4/29. The latest v20.3.2 community edition is dated 3/19.

Thanks,
Tomas

view message
FileNotificationCommand fails with "You must use the POST method when calling this action." error
(2 responses) r kursawe 2020-04-24 08:29

I would like to trigger pipeline jobs on a 19.3 server via the Java client API. Therefore, I included labkey-client-api-1.0.1-SNAPSHOT-all.jar in the project and wrote this code:

String user = "username";
String password = "password";
BasicAuthCredentialsProvider cp = new BasicAuthCredentialsProvider(user,password);
Connection cn = new Connection("http://localhost:8080/",cp);
FileNotificationCommand cmd = new FileNotificationCommand();
FileNotificationResponse response = cmd.execute(cn, "dummy collaboration/dummy study");

The server response is a 405 with the message: "You must use the POST method when calling this action." The _httpClientContext property of the Connection object shows that a GET request has been sent.

How can I configure the command to send a POST request?
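
One possible direction (an untested sketch only): the client API's PostCommand issues a POST rather than a GET, so the same action could in principle be invoked through it directly. The controller and action names below are unverified placeholders, not taken from the API docs, so substitute whatever the server actually uses for this action:

    import org.labkey.remoteapi.BasicAuthCredentialsProvider;
    import org.labkey.remoteapi.CommandResponse;
    import org.labkey.remoteapi.Connection;
    import org.labkey.remoteapi.PostCommand;

    public class FileNotifyViaPost
    {
        public static void main(String[] args) throws Exception
        {
            Connection cn = new Connection("http://localhost:8080/",
                    new BasicAuthCredentialsProvider("username", "password"));

            // "pipeline" and "fileNotify" are placeholder names for illustration only --
            // check the URL the server uses for this action and substitute the real
            // controller/action names.
            PostCommand cmd = new PostCommand("pipeline", "fileNotify");
            CommandResponse response = cmd.execute(cn, "dummy collaboration/dummy study");
            System.out.println(response.getText());
        }
    }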

view message
Cannot access admin console after upgrade to 20.3
(2 responses) m boehmer 2020-04-22 06:05

After a manual upgrade to Labkey Server 20.3, when trying to access the admin console I receive the following error message:

An unexpected error occurred. If contacting support regarding this error, please refer to error code: VUNKPZ
java.lang.NoSuchFieldError: labkeyVersion

Any ideas what might have gone wrong?

view message
Incremental update support for specimen import
(2 responses) tstellin 2020-04-17 09:14

Hi,

We are curious if LabKey supports (or plans to support) an "incremental" specimen import. This would allow only the specimen event deltas to be specified in the .specimen ZIP archive imported into LabKey each day. It would require complicating the interface--for example, a record would need to signify a "deleted" event (in contrast to the current behavior, where a deletion is signified by the absence of a previously-seen specimen event).

I didn't see anything on the current documentation that suggests an incremental import is supported. My understanding is that the current behavior merges the new .specimen ZIP file with the specimens loaded from the previous import.

Thanks,
-Tobin

view message
Specimens not populating after completed specimen archive import
(1 response) Karen Bates 2020-04-08 08:54

Hello,

I imported a new specimens archive and it said it was completed. However, the specimens are not populating in the Specimens Data tab. I noticed when I was importing the archive that there was not an option for "Replace" or "Merge" as the documentation suggested there would be. But I wasn't sure if that was because this was the first archive import for the study, and therefore it's not necessary to replace or merge anything.

Is there an additional step after the import that needs to happen for the specimens to populate?

Thank you for your assistance!

view message
Error while running trigger script
(4 responses) eva pujadas 2020-04-07 09:32

Dear LabKey support,

We have installed a new version of LabKey, 19.3.7, and are detecting a problem when running some trigger scripts attached to study datasets. The same scripts were working in the previous version.

The web interface error message is the following:
"script error: afterInsert trigger closed the connection, possibly due to constraint violation"

The error logs are attached.

The trigger JavaScript script performs some checks in a dataset and tries to insert new rows in two other datasets. The error seems to happen when trying to insert into the second dataset.

Thanks for any ideas about this error's cause.
Best regards,
Eva

 labkey.log 
view message
Custom site welcome page
(1 response) johann pellet 2020-04-02 03:07

Hi all,

I would like more information about how we should proceed to customize an alternative welcome page (Labkey 19.3).
Despite the documentation here (https://www.labkey.org/Documentation/19.3/wiki-page.view?name=customizeLook) and the help bubble on the Look and Feel Settings LabKey page, it's not clear to me how to proceed.

Help bubble:
The relative URL of the page, either a full LabKey view or simple HTML resource, to be loaded as the welcome page. The welcome page will be loaded when a user loads the site with no action provided (i.e. https://www.labkey.org). This is often used to provide a splash screen for guests. Note: do not include the contextPath in this string. For example: /myModule/welcome.view to select a view within a module, or /myModule/welcome.html for a simple HTML page in the web directory of your module.

What I tried is

  • to create a new folder myModule in /usr/local/labkey/labkey/modules. Into this folder I created a folder views and a file views/welcome.html
  • in the Look and Feel settings page I chose "myModule.welcome".
    I also tried other relative URLs without success.

Thank you for your help.

Johann

view message
Bug in version 20.3.0
(1 response) tvaisar 2020-04-01 10:47

Hi,
I think there is a bug in v20.3.0: in the MS2 runs view RunSummary, the modifications link shows HTML code rather than a pop-up with a list of modifications. See the attached screenshot.

Tomas

 LabKeyServer_bug_20200401.pptx 
view message
Recent Security Update
(2 responses) jgane 2020-03-24 08:28

Hi,

I was just curious if there was any more information about the recent security update that was done in 20.3 and hotfixed into 19.3.7? What did the security issue pertain to? Did it affect any other versions of LabKey before 19.3?

Thanks,
Jon

view message
Updated node and npm versions
Susan Hert 2020-03-20 10:11

This morning I updated the npm and node versions used in our trunk/develop builds. This means that when you pull in this change you are very likely going to see an error like this from node-sass when building:

ERROR in ./src/theme/index.scss (./node_modules/css-loader/dist/cjs.js??ref--5-1!./node_modules/sass-loader/lib/loader.js??ref--5-2!./src/theme/index.scss)
    Module build failed (from ./node_modules/sass-loader/lib/loader.js):
    Error: Missing binding /Users/dev/Development/labkey/trunk/labkey-ui-components/packages/components/node_modules/node-sass/vendor/darwin-x64-72/binding.node
    Node Sass could not find a binding for your current environment: OS X 64-bit with Node.js 12.x
    
    Found bindings for the following environments:
      - OS X 64-bit with Node.js 10.x

To remedy this, you can run ./gradlew cleanNodeModules (or, to be really thorough, ./gradlew -PmoduleSet=all cleanNodeModules) and then run your build command.

view message
LDAP Authentication for Linux - Community Version (Moved from Installation Forum)
(4 responses) joseph mackey 2020-03-06 12:05

LDAP Authentication for Linux - Community Version joseph mackey EDIT 2020-02-21 14:36
Status: Active

Hello,
I have installed Labkey 19.3 Community Version.
While I have a connection string that works from the test screen, I cannot configure it in the LDAP Auth Settings.
I have configured other applications for AD authentication, and it works when using the sAMAccountName and the full DN (so does the test).
Even when adding the configuration to the labkey.xml file, it does not seem to make a difference.

Is there a way to enable verbose logging for LDAP auth attempts?
When adding the DN to the LDAP principal template, is there a way to substitute sAMAccountName instead of email or UID?
When using my full DN I am able to log in with my email and AD password, but I'm the only user able to log in with AD credentials.
Build info:
OS RedHat Enterprise Linux 7.7
Product Name PostgreSQL
Product Version 12.1
JDBC Driver Name PostgreSQL JDBC Driver
JDBC Driver Version 42.2.8
Servlet Container Apache Tomcat/9.0.30
Java Runtime Vendor N/A
Java Runtime Name OpenJDK Runtime Environment
Java Runtime Version 13.0.2+8

chetc (LabKey Support) responded: 2020-03-06 10:55
Hello,

Yes there is. There is no need to enable anything since this is already happening. The labkey.log file should indicate a LDAP login failure and so should the AuditLog found in the admin console.

You should be able to sub for anything. The principal template is used to search through the LDAP global directory and reassociate one value for another.

You can find more information about LDAP Configuration on our Documentation site.
https://www.labkey.org/Documentation/wiki-page.view?name=configLdap

Thanks,
Chet

chetc (LabKey Support) responded: 2020-03-06 10:55
Hello,

We appreciate you utilizing our forums!

However this forum will become inactive very soon. To follow up on this question or to ask new question please use our new forum.

LabKey Support Forum - https://www.labkey.org/home/Support/LabKey Support Forum/project-begin.view?

Thanks,
Chet

view message
ggplotly
(6 responses) dhutchison 2020-02-29 18:09

Hi,

Trying to get an interactive plot (heatmap) working, e.g., ggplotly, but it's not happening...

Any thoughts?

Here's what I've tried:
(saveWidget works fine in RStudio)

library(ggplot2)      # for ggplot/geom_tile below
library(plotly)       # for ggplotly used further down
library(htmlwidgets)  # for saveWidget used further down

data <- as.data.frame(labkey.data)

# data_melt is assumed to be derived from `data` (e.g. via reshape2::melt);
# the melt step is not shown here.

# plot
ggp <- ggplot(data_melt, aes(Condition, Analyte)) +
                   geom_tile(aes(fill = value)) +
                   theme(axis.text.x = element_text(
                   angle = 90,
                   size=6)) +
                   scale_x_discrete(breaks = data_melt$Condition)

works but not interactive and doesn't auto-size:

ggsave("${imgout:labkeyl_png}",plot=ggp,device="png")

save to a temp location not the report:

saveWidget(ggplotly(ggp), file = "test.html")

no error, but report doesn't show up:

saveWidget(ggplotly(ggp), file = "test.html")
rawHTML <- paste(readLines("test.html"),collapse="\n") 
# ${htmlout:output}
write(rawHTML, file="output")

view message
Error When Setting Up Developer Machine
(1 response) nazric55 2020-02-02 12:41

I've been following the documentation for setting up a developer machine (https://www.labkey.org/Documentation/wiki-page.view?name=devMachine) but I ran into issues in the "Build LabKey" section.

When running the command "gradlew deployApp", the build fails immediately after "Task :server:stageModules" starts (stacktrace pasted below).

The only other inconsistency I found in the guide was in the "Configure the LabKey Project in IntelliJ" section: when it asks to edit the configurations, there are no modules named "api_main" or "org.labkey-api_main", so I continued with "api" instead.

Other configuration information:
apache-tomcat-9.0.30
jdk-13.0.2 (OpenJDK)
When running the gradlew command I see that it's using Gradle 6.0
SVN revision 65036

Stacktrace from error (using --stacktrace flag)

Task :server:stageModules FAILED

FAILURE: Build failed with an exception.

  • What went wrong:
    Execution failed for task ':server:stageModules'.

Specify at least one source--a file or a resource collection.

  • Try:
    Run with --info or --debug option to get more log output. Run with --scan to get full insights.

  • Exception is:
    org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':server:stageModules'.
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.lambda$executeIfValid$1(ExecuteActionsTaskExecuter.java:187)
    at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:263)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:185)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:166)
    at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:109)
    at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
    at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:62)
    at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
    at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
    at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
    at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
    at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
    at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
    at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
    at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
    at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:41)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:374)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:361)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:354)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:340)
    at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.lambda$run$0(DefaultPlanExecutor.java:127)
    at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:191)
    at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:182)
    at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:124)
    at org.gradle.execution.plan.DefaultPlanExecutor.process(DefaultPlanExecutor.java:72)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.executeWithServices(DefaultTaskExecutionGraph.java:191)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.execute(DefaultTaskExecutionGraph.java:168)
    at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTaskExecutionAction.java:41)
    at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:40)
    at org.gradle.execution.DefaultBuildWorkExecutor.access$000(DefaultBuildWorkExecutor.java:24)
    at org.gradle.execution.DefaultBuildWorkExecutor$1.proceed(DefaultBuildWorkExecutor.java:48)
    at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildExecutionAction.java:49)
    at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:40)
    at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:33)
    at org.gradle.execution.IncludedBuildLifecycleBuildWorkExecutor.execute(IncludedBuildLifecycleBuildWorkExecutor.java:36)
    at org.gradle.execution.DeprecateUndefinedBuildWorkExecutor.execute(DeprecateUndefinedBuildWorkExecutor.java:39)
    at org.gradle.execution.BuildOperationFiringBuildWorkerExecutor$ExecuteTasks.run(BuildOperationFiringBuildWorkerExecutor.java:56)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
    at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
    at org.gradle.execution.BuildOperationFiringBuildWorkerExecutor.execute(BuildOperationFiringBuildWorkerExecutor.java:41)
    at org.gradle.initialization.DefaultGradleLauncher.runWork(DefaultGradleLauncher.java:241)
    at org.gradle.initialization.DefaultGradleLauncher.doClassicBuildStages(DefaultGradleLauncher.java:151)
    at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:130)
    at org.gradle.initialization.DefaultGradleLauncher.executeTasks(DefaultGradleLauncher.java:110)
    at org.gradle.internal.invocation.GradleBuildController$1.execute(GradleBuildController.java:60)
    at org.gradle.internal.invocation.GradleBuildController$1.execute(GradleBuildController.java:57)
    at org.gradle.internal.invocation.GradleBuildController$3.create(GradleBuildController.java:85)
    at org.gradle.internal.invocation.GradleBuildController$3.create(GradleBuildController.java:78)
    at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:189)
    at org.gradle.internal.work.StopShieldingWorkerLeaseService.withLocks(StopShieldingWorkerLeaseService.java:40)
    at org.gradle.internal.invocation.GradleBuildController.doBuild(GradleBuildController.java:78)
    at org.gradle.internal.invocation.GradleBuildController.run(GradleBuildController.java:57)
    at org.gradle.tooling.internal.provider.ExecuteBuildActionRunner.run(ExecuteBuildActionRunner.java:31)
    at org.gradle.launcher.exec.ChainingBuildActionRunner.run(ChainingBuildActionRunner.java:35)
    at org.gradle.launcher.exec.BuildOutcomeReportingBuildActionRunner.run(BuildOutcomeReportingBuildActionRunner.java:63)
    at org.gradle.tooling.internal.provider.ValidatingBuildActionRunner.run(ValidatingBuildActionRunner.java:32)
    at org.gradle.launcher.exec.BuildCompletionNotifyingBuildActionRunner.run(BuildCompletionNotifyingBuildActionRunner.java:39)
    at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:51)
    at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:45)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
    at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
    at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner.run(RunAsBuildOperationBuildActionRunner.java:45)
    at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:50)
    at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:47)
    at org.gradle.composite.internal.DefaultRootBuildState.run(DefaultRootBuildState.java:78)
    at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:47)
    at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:31)
    at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:42)
    at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:28)
    at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:78)
    at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:52)
    at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:59)
    at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:36)
    at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:68)
    at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:38)
    at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:37)
    at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:26)
    at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:43)
    at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:29)
    at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:60)
    at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:32)
    at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:55)
    at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:41)
    at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:48)
    at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:32)
    at org.gradle.launcher.daemon.server.exec.ExecuteBuild.doBuild(ExecuteBuild.java:68)
    at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.WatchForDisconnection.execute(WatchForDisconnection.java:39)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.ResetDeprecationLogger.execute(ResetDeprecationLogger.java:27)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.RequestStopIfSingleUsedDaemon.execute(RequestStopIfSingleUsedDaemon.java:35)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.create(ForwardClientInput.java:78)
    at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.create(ForwardClientInput.java:75)
    at org.gradle.util.Swapper.swap(Swapper.java:38)
    at org.gradle.launcher.daemon.server.exec.ForwardClientInput.execute(ForwardClientInput.java:75)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.LogAndCheckHealth.execute(LogAndCheckHealth.java:55)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.LogToClient.doBuild(LogToClient.java:63)
    at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment.doBuild(EstablishBuildEnvironment.java:82)
    at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
    at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
    at org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy$1.run(StartBuildOrRespondWithBusy.java:52)
    at org.gradle.launcher.daemon.server.DaemonStateCoordinator$1.run(DaemonStateCoordinator.java:297)
    at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    Caused by: : Specify at least one source--a file or a resource collection.
    at org.apache.tools.ant.taskdefs.Copy.validateAttributes(Copy.java:678)
    at org.apache.tools.ant.taskdefs.Copy.execute(Copy.java:450)
    at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:99)
    at org.gradle.api.internal.project.ant.BasicAntBuilder.nodeCompleted(BasicAntBuilder.java:80)
    at org.gradle.api.internal.project.ant.BasicAntBuilder.doInvokeMethod(BasicAntBuilder.java:107)
    at org.labkey.gradle.plugin.ServerDeploy$_addTasks_closure2$_closure23.doCall(ServerDeploy.groovy:85)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:664)
    at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:637)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$3.run(ExecuteActionsTaskExecuter.java:539)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
    at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:524)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:507)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$300(ExecuteActionsTaskExecuter.java:109)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.executeWithPreviousOutputFiles(ExecuteActionsTaskExecuter.java:258)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:247)
    at org.gradle.internal.execution.steps.ExecuteStep.lambda$execute$1(ExecuteStep.java:33)
    at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:33)
    at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:26)
    at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:63)
    at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:35)
    at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:49)
    at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:34)
    at org.gradle.internal.execution.steps.CancelExecutionStep.execute(CancelExecutionStep.java:43)
    at org.gradle.internal.execution.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:73)
    at org.gradle.internal.execution.steps.TimeoutStep.execute(TimeoutStep.java:54)
    at org.gradle.internal.execution.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
    at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:44)
    at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:54)
    at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:38)
    at org.gradle.internal.execution.steps.CacheStep.executeWithoutCache(CacheStep.java:153)
    at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:67)
    at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:41)
    at org.gradle.internal.execution.steps.BroadcastChangingOutputsStep.execute(BroadcastChangingOutputsStep.java:49)
    at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:44)
    at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:33)
    at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:38)
    at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:24)
    at org.gradle.internal.execution.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:92)
    at org.gradle.internal.execution.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:85)
    at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:55)
    at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:39)
    at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:76)
    at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:37)
    at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:36)
    at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:26)
    at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:94)
    at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:49)
    at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:79)
    at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:53)
    at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:74)
    at org.gradle.internal.execution.steps.SkipEmptyWorkStep.lambda$execute$2(SkipEmptyWorkStep.java:78)
    at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:78)
    at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:34)
    at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsStartedStep.execute(MarkSnapshottingInputsStartedStep.java:39)
    at org.gradle.internal.execution.steps.LoadExecutionStateStep.execute(LoadExecutionStateStep.java:40)
    at org.gradle.internal.execution.steps.LoadExecutionStateStep.execute(LoadExecutionStateStep.java:28)
    at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:33)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:174)
    ... 125 more

  • Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.0/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2s
5 actionable tasks: 1 executed, 4 up-to-date

view message
Transformation script that also queries a LabKey List/Assay?
(1 response) david sievers 2020-01-28 09:34

I'm developing a host of transformation scripts in Python, and I've been successful so far getting a couple examples to function. However, now I would like to be able to access more input variables, shall I say metadata, that I'm also storing in the parent LabKey Project folder as a List. I can access what is in the runProperties.tsv file, but I don't think this or the paths within will get me where I need to go.

What I'd like to do is validate naming conventions of incoming data columns against a directory of valid names within the transformation script. (Of course I could just use the Assay design itself to validate against during the import process, but I need to actually do this within the transformation script for my application.) I had initially thought of storing this directory in a text file next to the transformation script or also creating a list/set/dictionary object directly in the script. However, placing this data as a transparent and easily accessible LabKey List (or some other type of LK object) is most desirable.

Is there a preferred method of accessing LK Assay/List/DataClass tables from a transformation script? I could try using the LK API tools, but I'd like to avoid additional outside module import/usage within the script, and also avoid needing user credentials within the script, if there is a more direct method.
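
For reference, the lookup itself is trivial from the JavaScript API; something roughly like the sketch below is what I need (the list name, column name, and folder path are just placeholders). What I'm after is the equivalent from inside the Python transformation script, ideally without pulling in extra modules or embedding credentials:

    // Rough sketch only: fetch the directory of valid column names from a List
    // in the parent project. 'ValidColumnNames', its 'Name' column, and the
    // '/MyProject' path are placeholders for my actual setup.
    LABKEY.Query.selectRows({
        containerPath: '/MyProject',
        schemaName: 'lists',
        queryName: 'ValidColumnNames',
        columns: ['Name'],
        success: function (data) {
            var validNames = data.rows.map(function (row) { return row.Name; });
            console.log('Valid column names: ' + validNames.join(', '));
        },
        failure: function (error) {
            console.error(error.exception);
        }
    });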

Thanks!

view message
LabKey Teamcity Server Maintenance 12/31/2019 @ 3 PM
(1 response) stuartm 2019-12-31 14:52

We will be performing scheduled maintenance on the LabKey TeamCity build server today at 3 PM PST to address some latency issues with the TeamCity database. Downtime is expected to be approximately 30 minutes.

During the maintenance window the TeamCity build queue will be paused and resumed once the maintenance has been completed.

view message
datasetcolumns table not showing fields imported through assay module
(1 response) WayneH 2019-12-05 13:18

I imported some data to datasets via the assay module. I want to produce a table describing the queries and fields in the schema, but I found that the datasetcolumns table in the study schema only shows fields created by the user, not those created by the assay module. That seems odd. Am I missing something? I don't want to ETL a copy of the data to another dataset just for reporting.

Any thoughts?

Thanks,

Wayne

view message
Question regarding LDAP in upcoming release
(1 response) bront 2019-12-03 09:35

Hi,

Is it true that LDAP will be a premium feature in version 20.1? Will database authentication remain in the community edition?

https://www.labkey.org/Documentation/wiki-page.view?name=configLdap

thanks,

bront

view message
Creating views from queries via API
(1 response) millerjw4 2019-11-13 12:09

Hello,

Currently we are using LabKey to connect to an external MSSQL data source. Once the data source is connected, the tables can be viewed via the built-in queries on the connected schema. However, I would like all of these views to be available to users of the project by default, without having to go in and manually create a Query Report based on each of the 100 or so queries.

Is there any way to create these Query Report views in the LabKey project programmatically?
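
Enumerating the queries programmatically looks straightforward enough (rough sketch below); it's the step of materializing each one as a Query Report that project users see by default that I can't find an API for:

    // Rough sketch: list the built-in queries exposed by the connected schema.
    // 'myExternalSchema' is a placeholder for our external MSSQL schema name.
    LABKEY.Query.getQueries({
        schemaName: 'myExternalSchema',
        success: function (data) {
            data.queries.forEach(function (q) {
                console.log(q.name);
                // Missing piece: programmatically create a Query Report
                // (or shared default view) for each query here.
            });
        }
    });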

Thanks,
Jonathan

view message
Project User View error code 45B01R?
(2 responses) WayneH 2019-11-06 07:13

Good morning..

We recently found an error in one of our projects that has multiple group assignments per user. It appears as if the group ids are being interpreted as a string instead of integer values when there are multiple. One of the error codes we got is 45B01R. It essentially says it cannot convert the group ids to integers. We have two server implementations on the same version, and this one is the only one that shows the error. The behavior this causes is that the project user view cannot show the complete list of users, because it stops rendering the list when it hits the error.

Any thoughts?

view message
Is there a way to create a container from the java api?
(3 responses) ccharpen 2019-10-30 15:53

I was able to get all containers programmatically, but now I need to create them. I thought, if there was such an API, it'd be here: https://www.labkey.org/download/clientapi_docs/java-api-9.3/doc/org/labkey/remoteapi/PostCommand.html

But it's not. Any suggestions?
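
For what it's worth, the JavaScript API does have LABKEY.Security.createContainer; a rough sketch of the kind of call I'd like to make from Java is below (folder names are placeholders), in case that helps clarify what I'm after:

    // Rough sketch of the JavaScript call I'm hoping to replicate from Java.
    // '/ParentProject' and 'NewSubfolder' are placeholders.
    LABKEY.Security.createContainer({
        containerPath: '/ParentProject',   // parent container
        name: 'NewSubfolder',              // folder to create
        description: 'Created programmatically',
        success: function (container) {
            console.log('Created container: ' + container.name);
        },
        failure: function (error) {
            console.error(error.exception);
        }
    });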

view message
Wrong column selected by Rlabkey when retrieving visits
(5 responses) Edward 2019-10-21 06:26

I have a home folder in which I have five different sub-studies. In each sub-study, the visits run from 1.1 to 12.1. When I try to retrieve rows from the tables in each study, only the first study returns the correct sequences, i.e. 1.1-12.1. The other sub-studies return integer values such as 7-19 or 48-60. When I tried to import the visit map through the following code, I found out that these returned integers are actually row ids and not the sequence number or the visit label. Can you kindly fix this issue? Or is there an alternative way to get all the columns of a table (except the hidden ones) plus the visit label?

labkey.data <- labkey.selectRows(
    baseUrl=labkey.url.base,
    folderPath=labkey.url.path,
    schemaName="study",
    queryName="Visit"
)


view message
Gradle Plugins Version 1.8
Susan Hert 2019-10-17 14:45

As of r64631 in trunk, the gradlePluginsVersion has been updated to v1.8. This version brings some exciting (to a build-geek) changes, including:

  • Removal of the schemas jar for all modules (the schemas classes are now compiled into the implementation jar for each module that has them)
  • Renaming of the jsp jar files (to remove the classifiers)

Because of these jar changes, once you pull in this change, you will want to do a cleanBuild. Without this, you'll have stale schema jars in your build directory and you will likely end up with two copies of some of the JSP jars, which could make for confusing behavior.

With this plugin version and going forward with LabKey 19.3, we will also be publishing the artifacts of our builds in Artifactory a little differently. Namely:

  • We will publish the api jars under the group org.labkey.api (e.g., you can reference the core api jar file as org.labkey.api:core:19.3-SNAPSHOT)
  • We will publish the .module files under the group org.labkey.module (e.g., you can reference the core module file as org.labkey.module:core:19.3-SNAPSHOT@module. The "@module" is necessary because the extension of that file is .module instead of .jar).

You shouldn't need to make any changes in your current build as a result of this unless you had explicitly been referencing artifacts using 'org.labkey' as the group instead of using the BuildUtils.addLabKeyDependency method.

The .module files have pom files published with them that declare dependencies on other modules, which allows for an easier mechanism for getting all the modules required for a valid server deployment. We are, however, still auditing these dependency declarations to make sure they capture everything they should and there will likely be refinements of this in the future.

To enable the publishing of the module dependencies, the use of the property moduleDependencies in the module.properties file is now deprecated in favor of declaring dependencies within a module's build.gradle file. See more information on how to do this here.

If you have any questions or problems, please let us know.

Susan

view message
Get list of all folders (Java)
(4 responses) ccharpen 2019-10-07 16:16

As the title says, I want to get a hierarchical list of all folders.

This is what I've tried:

// Taking a stab in the dark here with the 'core' schema and 'Containers' query. 
SelectRowsCommand cmd = new SelectRowsCommand("core", "Containers");

// null for root. I was hoping this would return something useful, but as far as I can tell, it doesn't. 
SelectRowsResponse resp = cmd.execute(cn, null);

Thanks.

view message
End session via API
(1 response) WayneH 2019-09-13 08:30

Good afternoon,

just wondering if there is a way to end a session or log out via the server API. It doesn't seem to be described in the documentation.

Thanks,

Wayne.

view message
Create Project through Python API
(4 responses) David Owen 2019-09-12 07:51

Is it possible to dynamically create a project through the Python API? I couldn't find much information in the documentation. If not, are there any other tools I can use to programmatically create projects?

Thanks,
Dave

view message
Duplicate rows query
(1 response) fcf 2019-06-24 16:12

I am working off a list and trying to duplicate rows to allow editing of the duplicates. It is for an order book where people can re-order supplies, making changes to the previous orders as necessary. I am modeling the code after a bulk edit, but I am having trouble selecting data to be edited and inserted. Here is the bulk edit code:

    var url = LABKEY.ActionURL.buildURL('query', 'updateQueryRows.view', null, {
        schemaName: dataRegion.schemaName,
        'query.queryName': dataRegion.queryName,
        dataRegionSelectionKey: dataRegion.selectionKey
    });
    var form = dataRegion.form;
    console.log(typeof form);
    if (form && verifySelected.call(this, form, url, 'POST', 'rows')) {
        submitForm(form);
    }
    return false;

I thought it would be appropriate to change updateQueryRows to some form of a select statement, but cannot get it to work. Is there a duplicate rows query function? Any ideas?
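
Conceptually, what I'm trying to do is select the chosen rows and re-insert copies with the key stripped so that new keys get generated. A rough sketch of that approach is below (it filters on hard-coded keys instead of the data region selection, and the list name is a placeholder):

    // Rough sketch: duplicate list rows 3 and 7 by re-inserting copies of them.
    // 'OrderBook' is a placeholder list name and 'Key' is assumed to be the
    // list's key column.
    LABKEY.Query.selectRows({
        schemaName: 'lists',
        queryName: 'OrderBook',
        filterArray: [
            LABKEY.Filter.create('Key', '3;7', LABKEY.Filter.Types.EQUALS_ONE_OF)
        ],
        success: function (data) {
            var copies = data.rows.map(function (row) {
                var copy = {};
                Object.keys(row).forEach(function (col) {
                    // skip the key and any server-generated helper fields
                    if (col !== 'Key' && col.indexOf('_') !== 0) {
                        copy[col] = row[col];
                    }
                });
                return copy;
            });
            LABKEY.Query.insertRows({
                schemaName: 'lists',
                queryName: 'OrderBook',
                rows: copies,
                success: function () {
                    console.log('Duplicated ' + copies.length + ' rows');
                }
            });
        }
    });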

view message
XML Schema error on manual entry of dataset with NUMERIC datatype
(1 response) Georgia Mayfield 2019-06-24 15:10

Hello,

I got an XMLSchema error on implementing a dataset in LabKey running on Postgres 9.5 regarding the NUMERIC datatype. The dataset was implemented with a required column of type NUMERIC, without a precision or scale specified. This is a valid datatype in PostgreSQL, and according to the LabKey SQL documentation it is a valid LabKey-defined datatype.

Our users have specifically requested the numeric datatype as it allows both integer and decimal values without padding zeroes after the decimal point, unlike the INTEGER or DOUBLE datatypes, the concern being that padded zeroes may appear to add degrees of certainty to an assay result which may or may not be true.

The dataset is able to instantiate with the datatype fine, and bulk upload with both integer and decimal values completes successfully and can be repeated.

However, on single entry I get an error related to the type. I am able to load one record without error, and the second time I click the INSERT button the attached error occurs. This happens regardless of whether I insert an integer or a decimal in the NUMERIC column during single entry. It seems like single entry is treating the NUMERIC datatype like the INTEGER datatype, while bulk upload handles it appropriately.

Thanks!

Best, GM

 single_entry_failure.png 
view message
Remove avatar field in user details
(2 responses) WayneH 2019-06-17 08:30

Good morning,
not sure if anyone else has had this issue or question, but we wanted to remove the avatar field from the user profile/user details view for a project and can't seem to find a way to do so. The field is hidden in the default user view, but this seems to have no impact on what shows when users edit their profile.

We welcome any ideas..

Thanks,

view message
Row not updating using updateRows()
(1 response) la27 2019-06-05 09:38
Hello,

I have JavaScript code that says this:

 LABKEY.Query.updateRows({
        schemaName:"lists",
        queryName :"Reagent Requests",
        rows: [{"Key": 3,
            "ClientId": "123",
        }]
    });

I am trying to update the ClientId cell of row 3 in the Reagent Requests tutorial list with 123. I have made sure the key name is Key, but it will not update. It gives me "existing row not found." I put this function in the submitRequest() function found in the tutorial code.

   function submitRequest() {

        // Insert form data into the list.
       LABKEY.Query.insertRows({
           schemaName: 'lists',
           queryName: 'Reagent Requests',
           rowDataArray: [{
               "Name": document.ReagentReqForm.DisplayName.value,
               "Email": document.ReagentReqForm.Email.value,
               "UserID": document.ReagentReqForm.UserID.value,
               "Reagent": document.ReagentReqForm.Reagent.value,
               "Quantity": parseInt(document.ReagentReqForm.Quantity.value),
               "Date": new Date(),
               "Comments": document.ReagentReqForm.Comments.value,
               "Fulfilled": 'false'
           }],
           success: function(data) {
               // The set of URL parameters.
               var params = {
                   "name": 'confirmation', // The destination wiki page. The name of this parameter is not arbitrary.
                   "userid": LABKEY.Security.currentUser.id // The name of this parameter is arbitrary.
               };

                // This changes the page after building the URL. Note that the wiki page destination name is set in params.
               var wikiURL = LABKEY.ActionURL.buildURL("wiki", "page", LABKEY.ActionURL.getContainer(), params);
               window.location = wikiURL;
           }
       });
   
    LABKEY.Query.updateRows({
        schemaName:"lists",
        queryName :"Reagent Requests",
        rows: [{"Key": "3",
            "ClientId": "123",
               }]
               });
               
}


 </script>
view message
Scheduled Maintenance: artifactory.labkey.com Friday May 31, 2019 7 AM - 10 AM PDT
(1 response) stuartm 2019-05-30 11:40

We will be performing scheduled maintenance on LabKey's artifactory server ( https://artifactory.labkey.com/ ) on Friday May 31 2019 between 7AM and 10 AM PDT.

This will affect build artifacts during this maintenance period.

While Artifactory is unavailable, you should use the --offline flag for gradle when building.

We are performing this maintenance to install the latest security patches for Artifactory and its underlying operating system.

view message
Git Migration: Test and customModules
Trey 2019-05-29 14:59

We just finished the next step in our effort to migrate our source code from SVN to Git. The previous step was the creation of the platform and commonAssays repositories.

This time, we are migrating our central automated test code (from server/test) and all of the modules from server/customModules. The bulk of this change happened in SVN revision 63845.

Steps for updating your local enlistment to the new structure:

  1. Before syncing SVN, create a patch with any pending changes to server/test, server/customModules, externalModules/onprcEHRModules, externalModules/maccoss, or sampledata/qc
  2. Get rid of the dependencies.txt files that are generated in some of the resources/credits directories for modules that have moved. You can do this manually or with the gradle command: './gradlew cleanWriteDependenciesList'
  3. Do an SVN update
    • You may encounter tree conflicts if you skipped the previous step. Delete the directories manually and resolve the tree conflicts to continue.
  4. Clone 'https://github.com/LabKey/testAutomation.git' into the 'server' directory
    • Optional - only necessary if you wish to run automated tests
  5. Clone any, all, or none of the new module repositories into 'server/modules'
  6. Update any customizations to your settings.gradle file to use the new module locations
view message
500: Unexpected server error when deleting folders
(2 responses) eva pujadas 2019-05-24 07:31

We are running LabKey 18.3 and have realized that project users (non-site-admin users) with the "Project Administrator" or "Folder Administrator" role cannot delete folders (or sub-folders) that they created themselves.

That is the error message:

500: Unexpected server error

java.lang.NullPointerException
at org.labkey.pipeline.api.PipelineManager.purge(PipelineManager.java:240)
at org.labkey.pipeline.PipelineModule.containerDeleted(PipelineModule.java:237)
at org.labkey.api.data.ContainerManager.fireDeleteContainer(ContainerManager.java:2108)
at org.labkey.api.data.ContainerManager.delete(ContainerManager.java:1573)
at org.labkey.core.admin.AdminController$DeleteFolderAction.handlePost(AdminController.java:6823)
at org.labkey.core.admin.AdminController$DeleteFolderAction.handlePost(AdminController.java:6784)
at org.labkey.api.action.FormViewAction.handleRequest(FormViewAction.java:101)
at org.labkey.api.action.FormViewAction.handleRequest(FormViewAction.java:80)
at org.labkey.api.action.BaseViewAction.handleRequest(BaseViewAction.java:177)
at org.labkey.api.action.SpringActionController.handleRequest(SpringActionController.java:482)
at org.labkey.api.module.DefaultModule.dispatch(DefaultModule.java:1266)
at org.labkey.api.view.ViewServlet._service(ViewServlet.java:204)
at org.labkey.api.view.ViewServlet.service(ViewServlet.java:131)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:742)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.api.data.TransactionFilter.doFilter(TransactionFilter.java:38)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.api.module.ModuleLoader.doFilter(ModuleLoader.java:1241)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.api.security.AuthFilter.doFilter(AuthFilter.java:215)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.labkey.core.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:118)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:493)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:81)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:650)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:800)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:800)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1471)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
request attributes
LABKEY.OriginalURL = http://labkey.scicore.unibas.ch/labkey/Handschin Group/test3/admin-deleteFolder.view?
LABKEY.StartTime = 1558707176893
LABKEY.action = deleteFolder
org.springframework.web.servlet.DispatcherServlet.CONTEXT = Root WebApplicationContext: startup date [Thu Feb 21 09:56:07 CET 2019]; parent: Root WebApplicationContext
LABKEY.controller = admin
LABKEY.Counter = 0
org.labkey.api.util.ExceptionUtil$exception = java.lang.NullPointerException
X-LABKEY-CSRF = 5013174ea2e084d8241882c5d776835a
LABKEY.container = /Handschin Group/test3
LABKEY.RequestURL = /labkey/Handschin%20Group/test3/admin-deleteFolder.view?
LABKEY.OriginalURLHelper = /labkey/Handschin%20Group/test3/admin-deleteFolder.view?

core schema database configuration
Server URL jdbc:postgresql://127.0.0.1/labkey
Product Name PostgreSQL
Product Version 9.6.6
Driver Name PostgreSQL JDBC Driver
Driver Version 42.2.5

Do you have any idea about which could be the error cause?

Thank you very much for your support,
Eva

view message
Webpart permissions per user or group
(1 response) Matt V 2019-05-14 09:59

Hello,
I'm creating a few new webparts and we need to limit who can see them. The permissions matrix (https://www.labkey.org/Documentation/wiki-page.view?name=webpartPermissions) has usually done the trick, but in this case it would be incredibly useful (borderline necessary) to be able to limit their display to groups or even specific users if groups aren't an option.

Is there a way to do this, even at a code level?

view message
list permissions
(1 response) WayneH 2019-05-08 11:29

looking for some suggestions on how to configure permissions such that project users are able to submit/edit data in a list via a JS-coded wiki page. I want to give them editor permissions over the data content but not over the wiki or other content.

thanks,

view message
Assign and manage folder permissions for containers..
(2 responses) WayneH 2019-04-29 08:59

Hi Jon,
kind of a follow-up to my previous post, but hopefully asking the question a bit more clearly. What we have is a list of about 1000 folders that are to receive uploaded files from collaborators, who also need access to them. We have recreated these folders as containers and now need to assign permissions to control access to the data for specific users. We are also hoping to find a way for those users to access their folder directly without seeing other extraneous data (a minimized view); we just want them to see the folders we are pushing them to. Basically, we need to add group permissions to containers at scale, using the JavaScript API, for over 1000 folders/containers, roughly as sketched below.
This is a project I am working on with someone, trying to help them figure out how to manage these folders and data access via the JS API. It's tough because we are dealing with such a large number of folders.
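
For example (the folder paths, group ids, and the Reader role below are placeholders, and the getPolicy/savePolicy usage is my best reading of the API docs):

    // Rough sketch: walk the ~1000 folders one at a time and grant each
    // collaborator group read access to its folder, sequentially so we don't
    // flood the server with simultaneous requests.
    var folderAssignments = [
        { path: '/Collaborations/Folder0001', groupId: 1001 },
        { path: '/Collaborations/Folder0002', groupId: 1002 }
        // ... roughly 1000 entries
    ];

    function applyNext(i) {
        if (i >= folderAssignments.length) { console.log('Done'); return; }
        var item = folderAssignments[i];
        // look up the folder's entity id, which the policy APIs key off of
        LABKEY.Security.getContainers({
            containerPath: item.path,
            success: function (info) {
                LABKEY.Security.getPolicy({
                    containerPath: item.path,
                    resourceId: info.id,
                    success: function (policy) {
                        policy.addRoleAssignment(item.groupId,
                            'org.labkey.api.security.roles.ReaderRole');
                        LABKEY.Security.savePolicy({
                            containerPath: item.path,
                            policy: policy,
                            success: function () { applyNext(i + 1); }
                        });
                    }
                });
            }
        });
    }

    applyNext(0);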

view message
WebDAV views
(1 response) WayneH 2019-04-10 10:05

Hey everyone, is there any way to configure the LabKey WebDAV/webfiles client user view or functionality?
We would love to hide the folder tree from some users, and in instances where we do show the tree, we would prefer not to show the home folder. I'm not at all familiar with this aspect, so any help would be appreciated.

Thanks,

view message
Modules Restart
(1 response) marcia hon 2019-03-21 11:21

Hello,

Currently, when I have a new module, I end up restarting the whole LabKey application.

Is there a way to update or add modules without restarting the whole LabKey system?

Thanks,
Marcia

view message
Core migration to Git
(1 response) avital 2019-03-20 15:00

Hello -

LabKey is in the process of migrating our source code from SVN to GitHub. This Friday evening we will be migrating the server/api, internal, and modules directories to Git. This change will only affect SVN trunk. After the migration the new repos will be found here: https://github.com/LabKey/platform, https://github.com/LabKey/commonassays. Once the migration is finished, we will announce the all-clear with final instructions on enlistment.

Thank you,
Avital

view message
Assign group permissions via js api
(2 responses) WayneH 2019-03-06 06:28

Good morning,

We see in the JavaScript API reference methods for creating containers, user accounts, and groups, and for adding members to groups. We are unclear on how to set permissions/security policy on the groups we create in order to define roles and permissions for users in the project. Can you point us in the right direction for this? Our aim is to manage groups, user profiles, and permissions programmatically, roughly along the lines sketched below.
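
Concretely, assuming the script runs in the target project, the flow we picture is something like this (the group name, user ids, and role are placeholders; the getPolicy/savePolicy step is our best guess from the API reference and is the part we'd like confirmed):

    // Rough sketch of the flow we're picturing, run from within the target
    // project. The group name, user ids, and role are placeholders, and the
    // policy calls are our best guess from the API reference.
    LABKEY.Security.createGroup({
        groupName: 'Data Editors',
        success: function (group) {
            // add existing users to the new project group
            LABKEY.Security.addGroupMembers({
                groupId: group.id,
                principalIds: [1234, 5678],   // placeholder user ids
                success: function () {
                    // the part we're unsure about: granting the group a role
                    LABKEY.Security.getPolicy({
                        resourceId: LABKEY.Security.currentContainer.id,
                        success: function (policy) {
                            policy.addRoleAssignment(group.id,
                                'org.labkey.api.security.roles.EditorRole');
                            LABKEY.Security.savePolicy({ policy: policy });
                        }
                    });
                }
            });
        }
    });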

Thanks,

WH

view message
Error with Foreign Keys
(1 response) marcia hon 2019-02-28 11:44

This issue happens intermittently with different values, and I don't understand why:

I have a custom table created via ETL. It has a column that references a lookup table.

When I put legitimate data into this column, I get the following error:

SqlExecutor.execute(); SQL []; ERROR: insert or update on table "samples_samplesnotreceived" violates foreign key constraint "fk_samsamnotrecdiagnosis1"
Detail: Key (diagnosis1)=(Hallucinogen Persisting Perception Disorder) is not present in table "diagnosis".; nested exception is org.postgresql.util.PSQLException: ERROR: insert or update on table "samples_samplesnotreceived" violates foreign key constraint "fk_samsamnotrecdiagnosis1"
Detail: Key (diagnosis1)=(Hallucinogen Persisting Perception Disorder) is not present in table "diagnosis".

"Hallucinogen Persisting Perception Disorder" is indeed in the "diagnosis" table but it is unable to recognize it.

Please let me know how to correct this.

Thanks,
Marcia

view message
Auditing Custom Tables
(5 responses) marcia hon 2019-02-28 10:34

Hello,

I created a custom Schema and corresponding custom tables.

How would I enable auditing of these tables? Is there some special commands in XML?

Thanks,
Marcia

view message
Name Checker Pipeline
(1 response) marcia hon 2019-02-28 10:13

Hello,

We have demographics tables with participantIDs that must have the following format: AAA##AAA####### .

Is it possible to create a pipeline that will verify that these naming conventions are followed?
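
The check I have in mind is just a pattern match; a rough sketch is below. Where it should run (a trigger script, a transform script, or a pipeline task) is exactly what I'm asking about, and the ParticipantId field name is a placeholder:

    // Rough sketch of the format check itself: 3 letters, 2 digits, 3 letters,
    // 7 digits (AAA##AAA#######).
    var PARTICIPANT_ID_PATTERN = /^[A-Za-z]{3}\d{2}[A-Za-z]{3}\d{7}$/;

    function isValidParticipantId(id) {
        return PARTICIPANT_ID_PATTERN.test(id);
    }

    // If a trigger script turns out to be the right place, I imagine wiring it
    // up something like this (field name is a placeholder):
    function beforeInsert(row, errors) {
        if (!isValidParticipantId(row.ParticipantId)) {
            errors.ParticipantId = 'ParticipantId must match AAA##AAA#######';
        }
    }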

Thanks,
Marcia

view message

Welcome to the LabKey Support Forum. This forum is for questions regarding LabKey Server general usage, installation, or development.

Posting Questions

Before you post a new question, please search to see if someone has already asked a similar one.

Next, please review the Community Forum Guidelines.

When you post a question, please include the following information:

  • Your operating system.
  • Web browser.
  • Version number of LabKey Server.
  • A detailed description of your problem or question, including instructions for reproducing your issue.
  • Error information. Please attach log files to your message, rather than pasting in long text. Error pages and log files also include an Error Code. Please include this code in your message.

Additional Resources

User Account

In order to post to the community forum, you'll need to register for a user account.  If you already have an account but have forgotten your password, you can reset your password using the link on the Sign in page.