Merge pull request #38 from ncsa/release-4.6.1
Release 4.7.0
navarroc authored Oct 31, 2024
2 parents b1f3d6c + d9b36af commit 1a4125a
Showing 39 changed files with 373 additions and 112 deletions.
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,18 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

## [4.7.0] - 2024-10-29

### Added
- User information to environment of process running the tool [#28](https://github.com/ncsa/datawolf/issues/28)
- Ability for extra environment variables (used by IN-CORE) [#35](https://github.com/ncsa/datawolf/issues/35)

### Changed
- IN-CORE Dataset DAO and FileStorage implementation to use latest API [#29](https://github.com/ncsa/datawolf/issues/29)
- Kubernetes executor prints exception [#23](https://github.com/ncsa/datawolf/issues/23)
- Upgrade hsqldb to 2.7.3 [#27](https://github.com/ncsa/datawolf/issues/27)
- Custom properties to include more configuration variables [#33](https://github.com/ncsa/datawolf/issues/33)

## [4.6.0] - 2023-02-15

### Added
7 changes: 3 additions & 4 deletions charts/datawolf/Chart.yaml
@@ -13,7 +13,7 @@ version: 1.0.1
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
appVersion: 4.6.0
appVersion: 4.7.0

# List of people that maintain this helm chart.
maintainers:
@@ -38,6 +38,5 @@ dependencies:
# annotations for artifact.io
annotations:
artifacthub.io/changes: |
- add ability to set dataset permission (public/private)
- fix Chart.yaml
- fix ingress (deploy at /datawolf)
- User information to environment of process running the tool
- Ability for extra environment variables (used by IN-CORE)
3 changes: 3 additions & 0 deletions charts/datawolf/templates/deployment.yaml
@@ -94,6 +94,9 @@ spec:
value: {{ .Values.jobs.cpu | quote }}
- name: KUBERNETES_MEMORY
value: {{ .Values.jobs.memory | quote }}
{{- if .Values.extraEnvVars }}
{{ .Values.extraEnvVars | toYaml | nindent 12 }}
{{- end }}
volumeMounts:
- name: {{ include "datawolf.fullname" . }}
mountPath: /home/datawolf/data
2 changes: 2 additions & 0 deletions charts/datawolf/values.yaml
@@ -33,6 +33,8 @@ jobs:
# default memory in GB per job
memory: 4.0

extraEnvVars: {}

serviceAccount:
# Specifies whether a service account should be created
create: true
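The two chart changes above work together: `values.yaml` declares an empty `extraEnvVars` value, and the deployment template injects it into the container's `env:` list via `toYaml | nindent 12`. A minimal sketch of an override, assuming the entries use the standard Kubernetes name/value form expected at that position (the variable names below are placeholders, not real DataWolf settings):

```
# values-override.yaml (hypothetical) -- extra environment variables for the
# DataWolf container; rendered verbatim into the pod's env: list.
extraEnvVars:
  - name: EXAMPLE_SERVICE_URL
    value: "https://example.org/api"
  - name: EXAMPLE_FEATURE_FLAG
    value: "true"
```

This would be applied with something like `helm upgrade --install datawolf charts/datawolf -f values-override.yaml`.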
2 changes: 1 addition & 1 deletion datawolf-core/pom.xml
@@ -5,7 +5,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-core</artifactId>

@@ -36,6 +36,8 @@
public abstract class Executor {
private static Logger logger = LoggerFactory.getLogger(Executor.class);

protected static String DATAWOLF_USER = "DATAWOLF_USER";

private StringBuilder log = new StringBuilder();
private LogFile logfile = new LogFile();
private int lastsave = 0;
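The `DATAWOLF_USER` key defined on the Executor base class above is what the command-line and Kubernetes executors (further down in this diff) export into the tool's environment. A hedged, tool-side sketch of how a launched tool might use it (the class name and messages are illustrative, not part of DataWolf):

```
// Hypothetical example of a tool reading the user exported by DataWolf.
public class WhoAmITool {
    public static void main(String[] args) {
        // DATAWOLF_USER is set to the execution creator's email when available.
        String user = System.getenv("DATAWOLF_USER");
        if (user == null || user.isEmpty()) {
            System.out.println("DATAWOLF_USER not set; running outside DataWolf?");
        } else {
            System.out.println("Tool invoked on behalf of: " + user);
        }
    }
}
```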
10 changes: 5 additions & 5 deletions datawolf-doc/doc/manual/index.html
@@ -247,18 +247,18 @@

https://opensource.ncsa.illinois.edu/projects/artifacts.php?key=WOLF

By default, the latest release is selected on the page (currently 4.6.0). To get early access to development releases, check the box **Show also prereleases.**
By default, the latest release is selected on the page (currently 4.7.0). To get early access to development releases, check the box **Show also prereleases.**

* Click on **Version**
* Select **4.6.0**
* Under **Files** select **datawolf-webapp-all-4.6.0-bin.zip**
* Select **4.7.0**
* Under **Files** select **datawolf-webapp-all-4.7.0-bin.zip**
* Click **I Accept** to accept the License.

This will give you the latest stable build, which includes both the Data Wolf Server and the Web Editor. You can also find links to the Java code there, as well as the manual. The link to the source code can be found at the end of this document.

### Installation and Setup

To install the files necessary for the Server and Editor, find where you downloaded Data Wolf and unzip it somewhere. This will create a folder called **datawolf-webapp-all-4.6.0**. In the next few sections, we'll discuss some of the important files that come with the installation you just unzipped so you can tailor your setup to meet your needs. If you wish to skip this, you can go directly to the section **Running Data Wolf Server and Editor**.
To install the files necessary for the Server and Editor, find where you downloaded Data Wolf and unzip it somewhere. This will create a folder called **datawolf-webapp-all-4.7.0**. In the next few sections, we'll discuss some of the important files that come with the installation you just unzipped so you can tailor your setup to meet your needs. If you wish to skip this, you can go directly to the section **Running Data Wolf Server and Editor**.

#### Data Wolf properties

@@ -443,7 +443,7 @@

#### Launch Scripts

If you go back to the folder **datawolf-webapp-all-4.6.0**, you will see a sub-folder called **bin**; open it. Inside you will find two scripts, **datawolf-service** and **datawolf-service.bat**. The latter is intended for running Data Wolf on a Windows machine and the former is for running on Mac & Linux. As with the previous section, knowledge of this file is not required unless you are interested in configuring the Data Wolf Server and Editor beyond the default settings. We will show snippets of the file **datawolf-service** and discuss what each section configures.
If you go back to the folder **datawolf-webapp-all-4.7.0**, you will see a sub-folder called **bin**; open it. Inside you will find two scripts, **datawolf-service** and **datawolf-service.bat**. The latter is intended for running Data Wolf on a Windows machine and the former is for running on Mac & Linux. As with the previous section, knowledge of this file is not required unless you are interested in configuring the Data Wolf Server and Editor beyond the default settings. We will show snippets of the file **datawolf-service** and discuss what each section configures.

```
# port for the jetty server
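To make the installation steps in the manual excerpt above concrete, a minimal sketch using the folder and script names the manual describes (the exact invocation may differ on your system):

```
unzip datawolf-webapp-all-4.7.0-bin.zip
cd datawolf-webapp-all-4.7.0
# Mac/Linux launch script; on Windows run bin\datawolf-service.bat instead
./bin/datawolf-service
```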
2 changes: 1 addition & 1 deletion datawolf-doc/pom.xml
@@ -3,7 +3,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-doc</artifactId>
</project>
2 changes: 1 addition & 1 deletion datawolf-domain/pom.xml
@@ -5,7 +5,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-domain</artifactId>

2 changes: 1 addition & 1 deletion datawolf-editor/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<packaging>war</packaging>
<artifactId>datawolf-editor</artifactId>
2 changes: 1 addition & 1 deletion datawolf-executor-commandline/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-commandline</artifactId>

@@ -108,6 +108,12 @@ public void execute(File cwd) throws AbortException, FailedException {
env.putAll(impl.getEnv());
}

// Add user to the environment in case a tool needs this information
if(execution.getCreator() != null) {
String user = execution.getCreator().getEmail();
env.put(DATAWOLF_USER, user);
}

// find the app to execute
command.add(findApp(impl.getExecutable().trim(), cwd));

@@ -170,9 +176,14 @@ public void execute(File cwd) throws AbortException, FailedException {
throw (new FailedException("Could not get input file.", e));
}
} else {

// Create a folder for the datasets
File inputFolder = new File(filename);
if (inputFolder.exists() && inputFolder.getAbsolutePath().startsWith(System.getProperty("java.io.tmpdir"))) {
// For single file, a tmp file got created above; however in this case, we need
// a temporary folder to store the files
inputFolder.delete();
}

if (!inputFolder.mkdirs()) {
throw (new FailedException("Could not create folder for input files"));
}
@@ -251,6 +262,7 @@ public void execute(File cwd) throws AbortException, FailedException {
sb.append(" ");
}
println("Executing : " + sb.toString());
logger.debug("Executing : " + sb.toString());

// create the process builder
ProcessBuilder pb = new ProcessBuilder(command);
@@ -369,11 +381,11 @@ public void execute(File cwd) throws AbortException, FailedException {
ds.setTitle(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle());
ds.setCreator(execution.getCreator());

ds = datasetDao.save(ds);

ByteArrayInputStream bais = new ByteArrayInputStream(stdout.toString().getBytes("UTF-8"));
FileDescriptor fd = fileStorage.storeFile(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle(), bais, execution.getCreator(), ds);

ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(impl.getCaptureStdOut()), ds.getId());
saveExecution = true;
} catch (IOException exc) {
@@ -385,11 +397,11 @@ public void execute(File cwd) throws AbortException, FailedException {
Dataset ds = new Dataset();
ds.setTitle(step.getTool().getOutput(impl.getCaptureStdErr()).getTitle());
ds.setCreator(execution.getCreator());
ds = datasetDao.save(ds);

ByteArrayInputStream bais = new ByteArrayInputStream(stderr.toString().getBytes("UTF-8"));
FileDescriptor fd = fileStorage.storeFile(step.getTool().getOutput(impl.getCaptureStdErr()).getTitle(), bais, execution.getCreator(), ds);

ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(impl.getCaptureStdErr()), ds.getId());
saveExecution = true;
@@ -419,15 +431,15 @@ public boolean accept(File pathname) {
for (File file : files) {
logger.debug("adding files to a dataset: " + file);
FileInputStream fis = new FileInputStream(file);
fileStorage.storeFile(file.getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(file.getName(), fis, execution.getCreator(), ds);
fis.close();
}

} else {
FileInputStream fis = new FileInputStream(entry.getValue());
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, execution.getCreator(), ds);
}
ds = datasetDao.save(ds);
// ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(entry.getKey()), ds.getId());
saveExecution = true;
4 changes: 2 additions & 2 deletions datawolf-executor-hpc/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-hpc</artifactId>

@@ -27,7 +27,7 @@
<dependency>
<groupId>com.jcraft</groupId>
<artifactId>jsch</artifactId>
<version>0.1.48</version>
<version>0.1.54</version>
</dependency>
</dependencies>
</project>
2 changes: 1 addition & 1 deletion datawolf-executor-java-tool/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-java-tool</artifactId>

2 changes: 1 addition & 1 deletion datawolf-executor-java/pom.xml
@@ -5,7 +5,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-java</artifactId>

2 changes: 1 addition & 1 deletion datawolf-executor-kubernetes/pom.xml
@@ -3,7 +3,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-kubernetes</artifactId>
<dependencies>
@@ -159,10 +159,17 @@ public State submitRemoteJob(File cwd) throws AbortException, FailedException {

// Create a folder for the datasets
File inputFolder = new File(filename);
if (inputFolder.exists()) {
// For single file, a tmp file got created above; however in this case, we need
// a temporary folder to store the files
inputFolder.delete();
}

if (!inputFolder.mkdirs()) {
throw (new FailedException("Could not create folder for input files"));
}


int duplicate = 1;
for (FileDescriptor fd : ds.getFileDescriptors()) {
String localFileName = fd.getFilename();
@@ -285,8 +292,25 @@ public State submitRemoteJob(File cwd) throws AbortException, FailedException {
container.args(command);
// add any environment variables
if (!impl.getEnv().isEmpty()) {
// TODO implement
//container.addEnvItem();
Map<String, String> environment = impl.getEnv();

for (Map.Entry<String, String> entry : environment.entrySet()) {
String key = entry.getKey();
String value = entry.getValue();
V1EnvVar envVar = new V1EnvVar();
envVar.setName(key);
envVar.setValue(value);
container.addEnvItem(envVar);
}
}

// Add user to the environment in case a tool needs this information
if(execution.getCreator() != null) {
String user = execution.getCreator().getEmail();
V1EnvVar envVar = new V1EnvVar();
envVar.setName(DATAWOLF_USER);
envVar.setValue(user);
container.addEnvItem(envVar);
}

// add resource limits
@@ -318,7 +342,7 @@ public State submitRemoteJob(File cwd) throws AbortException, FailedException {
throw e;
} catch (FailedException e) {
// Job could not be submitted, set state to waiting to try again
logger.info("Job not submitted because the job scheduler appears to be down, will try again shortly...");
logger.info("Job not submitted because the job scheduler appears to be down, will try again shortly...", e);
return State.WAITING;
// throw e;
} catch (Throwable e) {
@@ -384,12 +408,11 @@ public State checkRemoteJob() throws FailedException {
Dataset ds = new Dataset();
ds.setTitle(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle());
ds.setCreator(execution.getCreator());
ds = datasetDao.save(ds);

ByteArrayInputStream bais = new ByteArrayInputStream(lastlog.getBytes("UTF-8"));
FileDescriptor fd = fileStorage.storeFile(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle(), bais, execution.getCreator(), ds);

ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(impl.getCaptureStdOut()), ds.getId());
saveExecution = true;
}
@@ -419,15 +442,15 @@ public boolean accept(File pathname) {
for (File file : files) {
logger.debug("adding files to a dataset: " + file);
FileInputStream fis = new FileInputStream(file);
fileStorage.storeFile(file.getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(file.getName(), fis, execution.getCreator(), ds);
fis.close();
}

} else {
FileInputStream fis = new FileInputStream(entry.getValue());
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, execution.getCreator(), ds);
}
ds = datasetDao.save(ds);
// ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(entry.getKey()), ds.getId());
saveExecution = true;
2 changes: 1 addition & 1 deletion datawolf-jpa/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-jpa</artifactId>
<packaging>jar</packaging>
2 changes: 1 addition & 1 deletion datawolf-provenance/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<packaging>war</packaging>
<artifactId>datawolf-provenance</artifactId>
2 changes: 1 addition & 1 deletion datawolf-service-client/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-service-client</artifactId>

2 changes: 1 addition & 1 deletion datawolf-service/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-service</artifactId>
<dependencies>