diff --git a/README.md b/README.md index e69de29b..79fa4cf1 100644 --- a/README.md +++ b/README.md @@ -0,0 +1,96 @@ + + + +# Questions and Answers # + +In this section we try to cover the main ideas of DataForge in the form of questions and answers. + +## General ## + +**Q:** I have a lot of data to analyze. The analysis process is complicated, requires many stages, and the data flow is not always obvious. On top of that, the data size is huge, so I don't want to perform operations I don't need (calculate something I won't use, or calculate something twice). And yes, I need it to run in parallel, probably on a remote computer. By the way, I am sick and tired of scripts that modify other scripts that control scripts. Could you help me? + +**A:** Yes, that is precisely the problem DataForge was made to solve. It performs automated data manipulation with automatic optimization and parallelization. Importantly, data processing recipes are written declaratively, so it is easy to run computations on a remote station. DataForge also guarantees reproducibility of analysis results. +
+ +**Q:** How does it work? + +**A:** At the core of DataForge lies the idea of the **metadata processor**. It builds on the observation that in order to analyze something you need the data itself plus some additional information about what that data represents and what the user wants as a result. This additional information is called metadata and can be organized in a regular structure (a tree of values, not unlike XML or JSON). The important thing is that this distinction leaves no place for user instructions (or scripts): with DataForge, one does not need imperative commands. The framework configures itself according to the input metadata and decides which operations should be performed in the most efficient way. +
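The metadata-driven flow above can be sketched in a few lines. This is a hypothetical illustration, not the real DataForge API: all names here (`meta`, `plan`, the `task`/`fit` keys) are invented for the example. The point is only that the "processor" reads a declarative value tree and derives what to do from it.

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the metadata-processor idea (not the DataForge API):
// the user supplies a declarative metadata tree, and the framework decides
// what to run by reading it, with no imperative script involved.
public class MetaSketch {
    // A metadata tree: a nested structure of values, much like JSON.
    static final Map<String, Object> meta = Map.of(
            "task", "fit",
            "fit", Map.of(
                    "model", "gauss",
                    "range", List.of(0.0, 10.0)));

    // The "processor" configures itself from the metadata alone.
    @SuppressWarnings("unchecked")
    static String plan(Map<String, Object> meta) {
        String task = (String) meta.get(task = "task") instanceof String s ? s : null;
        Map<String, Object> cfg = (Map<String, Object>) meta.get(task);
        return "run " + task + " with model " + cfg.get("model");
    }

    public static void main(String[] args) {
        System.out.println(plan(meta));
    }
}
```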
+ +**Q:** But where does it take the algorithms from? + +**A:** Of course, the algorithms must be written somewhere; no magic here. The logic lives in specialized modules. Some modules are provided out of the box by the system core, while others need to be developed for a specific problem. +
+ +**Q:** So I still need to write the code? What is the difference then? + +**A:** Yes, someone still needs to write the code, but not necessarily you. Simple operations can be performed using the provided core logic. Your group can also have one programmer writing the logic while everyone else uses it without any real programming expertise. Moreover, the framework is organized in such a way that whoever writes additional logic does not need to think about complicated things like parallel computing, resource handling, logging, caching, etc. Most of that is handled by DataForge. +
+ +## Platform ## + +**Q:** Which platform does DataForge use? Which operating systems does it work on? + +**A:** DataForge is mostly written in Java and uses the JVM as its platform. It works on any system that supports the JVM (meaning almost any modern system, excluding some mobile platforms). +
+ + **Q:** But Java... it is slow! + + **A:** [It is not](https://stackoverflow.com/questions/2163411/is-java-really-slow/2163570#2163570). It lacks some hardware-specific optimizations and needs some additional time to start (due to the nature of JIT compilation), but otherwise it is at least as fast as other languages traditionally used in science. More importantly, its memory safety, tooling support and vast ecosystem make it the number one candidate for a data analysis framework. + +
+ + **Q:** Can I use my C++/Fortran/Python code in DataForge? + + **A:** Yes, as long as the code can be called from Java. Most common languages have a bridge for Java access. There are no problems at all with compiled C/Fortran libraries. Python code can be called via one of the existing Python-Java interfaces. It is also planned to implement remote method invocation for common languages, so your Python, or, say, Julia, code could run in its native environment. The metadata processor paradigm makes this much easier to do. + +
+ +## Features ## + +**Q:** What other features does DataForge provide? + +**A:** Alongside metadata processing (and a lot of tools for metadata manipulation and layering), DataForge has two additional important concepts: + +* **Modularisation**. Unlike many other frameworks, DataForge is intrinsically modular. The only mandatory part is a rather tiny core module; everything else can be customized. + +* **Context encapsulation**. Every DataForge task is executed in some context. The context isolates the environment for the task, works as a dependency injection base, and specifies how the task interacts with the external world. + + +
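Context encapsulation can be sketched as a small hierarchy of resolvers. This is a hypothetical illustration (the class and method names `Context`, `Plugin`, `load`, `resolve` are invented for the example, not the real DataForge API): a task's context holds its own plugins and falls back to a parent, so the task sees only what its environment provides.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch of context encapsulation (not the DataForge API):
// every task runs inside a context that isolates its environment and
// acts as a dependency injection base.
public class ContextSketch {
    interface Plugin { String name(); }

    static class Context {
        private final Map<String, Plugin> plugins = new HashMap<>();
        private final Context parent;

        Context(Context parent) { this.parent = parent; }

        void load(Plugin p) { plugins.put(p.name(), p); }

        // Resolution falls back to the parent context, so a task sees
        // only what its own context (or an ancestor) provides.
        Optional<Plugin> resolve(String name) {
            Plugin p = plugins.get(name);
            if (p != null) return Optional.of(p);
            return parent == null ? Optional.empty() : parent.resolve(name);
        }
    }

    public static void main(String[] args) {
        Context global = new Context(null);
        global.load(() -> "logging");
        Context task = new Context(global);
        task.load(() -> "fitting");

        System.out.println(task.resolve("logging").isPresent());   // inherited from parent
        System.out.println(global.resolve("fitting").isPresent()); // isolated: parent cannot see it
    }
}
```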
+ +**Q:** OK, but now I want to work directly with my measuring devices. How can I do that? + +**A:** The [dataforge-control](${site.url}/docs.html#control) module provides interfaces to interact with the hardware. Out of the box it supports safe communication with TCP/IP or COM/tty based devices. Specific devices can be declared via additional modules. It is also possible to maintain data storage with the [dataforge-storage](${site.url}/docs.html#storage) module. +
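The "safe communication" idea above can be sketched as follows. This is a hypothetical illustration, not the dataforge-control API: the `Device` and `Transport` names are invented, and the transport here is an in-memory fake standing in for a TCP socket or serial (COM/tty) port. The point is that all exchanges go through one synchronized channel, so concurrent callers cannot interleave messages on the wire.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch in the spirit of dataforge-control (invented names):
// commands go through one synchronized exchange, so concurrent callers
// cannot interleave command/response pairs on the underlying connection.
public class DeviceSketch {
    interface Transport { String exchange(String command); }

    static class Device {
        private final Transport transport;
        Device(Transport transport) { this.transport = transport; }

        // One command-response exchange at a time.
        synchronized String command(String cmd) {
            return transport.exchange(cmd);
        }
    }

    public static void main(String[] args) {
        // Fake transport recording what was "sent to the wire".
        List<String> wire = new ArrayList<>();
        Device d = new Device(cmd -> { wire.add(cmd); return "OK:" + cmd; });
        System.out.println(d.command("READ?"));
        System.out.println(wire.size());
    }
}
```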
+ +**Q:** Declarations and metadata are good, but I want my scripts back! + +**A:** We can do that. [GRIND](${site.url}/docs.html#grind) provides a shell-like environment called GrindShell. It allows running imperative scripts with full access to all of the DataForge functionality. GRIND scripts are context-encapsulated. There are also convenient feature wrappers called helpers that can be loaded into the shell when new feature modules are added. +
+ +## Misc ## + +**Q:** So everything looks great, can I replace my ROOT / other data analysis framework with DataForge? + +**A:** Note that DataForge is made for analysis, not for visualisation. The visualisation and user interaction capabilities of DataForge are rather limited compared to frameworks like ROOT, JAS3 or DataMelt. The idea is to provide a reliable API and core functionality. In fact, JAS3 and DataMelt could be used as front-ends for the DataForge mechanics. It is planned to add an interface to ROOT via JFreeHep AIDA. +
+ +**Q:** How does DataForge compare to cluster computation frameworks like Hadoop or Spark? + +**A:** Again, it is not the purpose of DataForge to replace cluster software. DataForge has some internal parallelism mechanics and implementations, but they are most certainly worse than specially developed programs. Still, DataForge is not tied to a single implementation: your favourite parallel processing tool can still be used as a back-end for DataForge, with the full benefit of its configuration tools and integrations, and with no performance overhead. +
+ +**Q:** Is it possible to use DataForge in notebook mode? + +**A:** Yes, it is. DataForge can be used as-is from the [beaker/beakerx](http://beakernotebook.com/) Groovy kernel with minor additional adjustments. It is planned to provide a separate DataForge kernel for `beakerx` which will automatically start a specific GRIND shell. +
+ +**Q:** Can I use DataForge on a mobile platform? + +**A:** DataForge is modular. The core and most of the API are pretty compact, so they can be used in Android applications. Some modules are designed for PC and cannot be used on other platforms. iPhone does not support Java and therefore can only use client-side DataForge applications. diff --git a/build.gradle.kts b/build.gradle.kts index 8494144e..016a350e 100644 --- a/build.gradle.kts +++ b/build.gradle.kts @@ -1,18 +1,20 @@ -val dataforgeVersion by extra("0.1.2") +plugins { + id("scientifik.mpp") version "0.1.4" apply false + id("scientifik.publish") version "0.1.4" apply false +} -allprojects { - repositories { - jcenter() - maven("https://kotlin.bintray.com/kotlinx") - } +val dataforgeVersion by extra("0.1.3") +val bintrayRepo by extra("dataforge") +val githubProject by extra("dataforge-core") + +allprojects { group = "hep.dataforge" version = dataforgeVersion } subprojects { if (name.startsWith("dataforge")) { - apply(plugin = "npm-bintray") - apply(plugin = "npm-artifactory") + apply(plugin = "scientifik.publish") } } \ No newline at end of file diff --git a/buildSrc/build.gradle.kts b/buildSrc/build.gradle.kts deleted file mode 100644 index 1ebbdf4d..00000000 --- a/buildSrc/build.gradle.kts +++ /dev/null @@ -1,20 +0,0 @@ -plugins { - `kotlin-dsl` -} - -repositories { - gradlePluginPortal() - jcenter() -} - -val kotlinVersion = "1.3.31" - -// Add plugins used in buildSrc as dependencies, also we should specify version only here -dependencies { - implementation("org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlinVersion") - implementation("org.jfrog.buildinfo:build-info-extractor-gradle:4.9.5") - implementation("com.jfrog.bintray.gradle:gradle-bintray-plugin:1.8.4") - implementation("org.jetbrains.dokka:dokka-gradle-plugin:0.9.18") - implementation("com.moowork.gradle:gradle-node-plugin:1.3.1") - implementation("org.openjfx:javafx-plugin:0.0.7") -} diff --git a/buildSrc/settings.gradle.kts 
b/buildSrc/settings.gradle.kts deleted file mode 100644 index e69de29b..00000000 diff --git a/buildSrc/src/main/kotlin/Versions.kt b/buildSrc/src/main/kotlin/Versions.kt deleted file mode 100644 index 883af120..00000000 --- a/buildSrc/src/main/kotlin/Versions.kt +++ /dev/null @@ -1,9 +0,0 @@ -// Instead of defining runtime properties and use them dynamically -// define version in buildSrc and have autocompletion and compile-time check -// Also dependencies itself can be moved here -object Versions { - val ioVersion = "0.1.8" - val coroutinesVersion = "1.2.1" - val atomicfuVersion = "0.12.6" - val serializationVersion = "0.11.0" -} diff --git a/buildSrc/src/main/kotlin/dokka-publish.gradle.kts b/buildSrc/src/main/kotlin/dokka-publish.gradle.kts deleted file mode 100644 index 318e08ae..00000000 --- a/buildSrc/src/main/kotlin/dokka-publish.gradle.kts +++ /dev/null @@ -1,59 +0,0 @@ -import org.jetbrains.dokka.gradle.DokkaTask - -plugins { - kotlin("multiplatform") - id("org.jetbrains.dokka") - `maven-publish` -} - -kotlin { - - val dokka by tasks.getting(DokkaTask::class) { - outputFormat = "html" - outputDirectory = "$buildDir/javadoc" - jdkVersion = 8 - - kotlinTasks { - // dokka fails to retrieve sources from MPP-tasks so we only define the jvm task - listOf(tasks.getByPath("compileKotlinJvm")) - } - sourceRoot { - // assuming only single source dir - path = sourceSets["commonMain"].kotlin.srcDirs.first().toString() - platforms = listOf("Common") - } - // although the JVM sources are now taken from the task, - // we still define the jvm source root to get the JVM marker in the generated html - sourceRoot { - // assuming only single source dir - path = sourceSets["jvmMain"].kotlin.srcDirs.first().toString() - platforms = listOf("JVM") - } - } - - val javadocJar by tasks.registering(Jar::class) { - dependsOn(dokka) - archiveClassifier.set("javadoc") - from("$buildDir/javadoc") - } - - publishing { - - // publications.filterIsInstance().forEach { publication -> -// if 
(publication.name == "kotlinMultiplatform") { -// // for our root metadata publication, set artifactId with a package and project name -// publication.artifactId = project.name -// } else { -// // for targets, set artifactId with a package, project name and target name (e.g. iosX64) -// publication.artifactId = "${project.name}-${publication.name}" -// } -// } - - targets.all { - val publication = publications.findByName(name) as MavenPublication - - // Patch publications with fake javadoc - publication.artifact(javadocJar.get()) - } - } -} \ No newline at end of file diff --git a/buildSrc/src/main/kotlin/js-test.gradle.kts b/buildSrc/src/main/kotlin/js-test.gradle.kts deleted file mode 100644 index 61759a28..00000000 --- a/buildSrc/src/main/kotlin/js-test.gradle.kts +++ /dev/null @@ -1,44 +0,0 @@ -import com.moowork.gradle.node.npm.NpmTask -import com.moowork.gradle.node.task.NodeTask -import org.gradle.kotlin.dsl.* -import org.jetbrains.kotlin.gradle.tasks.Kotlin2JsCompile - -plugins { - id("com.moowork.node") - kotlin("multiplatform") -} - -node { - nodeModulesDir = file("$buildDir/node_modules") -} - -val compileKotlinJs by tasks.getting(Kotlin2JsCompile::class) -val compileTestKotlinJs by tasks.getting(Kotlin2JsCompile::class) - -val populateNodeModules by tasks.registering(Copy::class) { - dependsOn(compileKotlinJs) - from(compileKotlinJs.destinationDir) - - kotlin.js().compilations["test"].runtimeDependencyFiles.forEach { - if (it.exists() && !it.isDirectory) { - from(zipTree(it.absolutePath).matching { include("*.js") }) - } - } - - into("$buildDir/node_modules") -} - -val installMocha by tasks.registering(NpmTask::class) { - setWorkingDir(buildDir) - setArgs(listOf("install", "mocha")) -} - -val runMocha by tasks.registering(NodeTask::class) { - dependsOn(compileTestKotlinJs, populateNodeModules, installMocha) - setScript(file("$buildDir/node_modules/mocha/bin/mocha")) - setArgs(listOf(compileTestKotlinJs.outputFile)) -} - 
-tasks["jsTest"].dependsOn(runMocha) - - diff --git a/buildSrc/src/main/kotlin/npm-artifactory.gradle.kts b/buildSrc/src/main/kotlin/npm-artifactory.gradle.kts deleted file mode 100644 index d792dffb..00000000 --- a/buildSrc/src/main/kotlin/npm-artifactory.gradle.kts +++ /dev/null @@ -1,38 +0,0 @@ -import groovy.lang.GroovyObject -import org.jfrog.gradle.plugin.artifactory.dsl.PublisherConfig -import org.jfrog.gradle.plugin.artifactory.dsl.ResolverConfig - -plugins { - id("com.jfrog.artifactory") -} - -artifactory { - val artifactoryUser: String? by project - val artifactoryPassword: String? by project - val artifactoryContextUrl = "http://npm.mipt.ru:8081/artifactory" - - setContextUrl(artifactoryContextUrl)//The base Artifactory URL if not overridden by the publisher/resolver - publish(delegateClosureOf { - repository(delegateClosureOf { - setProperty("repoKey", "gradle-dev-local") - setProperty("username", artifactoryUser) - setProperty("password", artifactoryPassword) - }) - - defaults(delegateClosureOf{ - invokeMethod("publications", arrayOf("jvm", "js", "kotlinMultiplatform", "metadata")) - //TODO: This property is not available for ArtifactoryTask - //setProperty("publishBuildInfo", false) - setProperty("publishArtifacts", true) - setProperty("publishPom", true) - setProperty("publishIvy", false) - }) - }) - resolve(delegateClosureOf { - repository(delegateClosureOf { - setProperty("repoKey", "gradle-dev") - setProperty("username", artifactoryUser) - setProperty("password", artifactoryPassword) - }) - }) -} diff --git a/buildSrc/src/main/kotlin/npm-bintray.gradle.kts b/buildSrc/src/main/kotlin/npm-bintray.gradle.kts deleted file mode 100644 index b152d163..00000000 --- a/buildSrc/src/main/kotlin/npm-bintray.gradle.kts +++ /dev/null @@ -1,97 +0,0 @@ -@file:Suppress("UnstableApiUsage") - -import com.jfrog.bintray.gradle.BintrayExtension.PackageConfig -import com.jfrog.bintray.gradle.BintrayExtension.VersionConfig - -// Old bintray.gradle script converted to 
real Gradle plugin (precompiled script plugin) -// It now has own dependencies and support type safe accessors -// Syntax is pretty close to what we had in Groovy -// (excluding Property.set and bintray dynamic configs) - -plugins { - id("com.jfrog.bintray") - `maven-publish` -} - -val vcs = "https://github.com/mipt-npm/kmath" - -// Configure publishing -publishing { - repositories { - maven("https://bintray.com/mipt-npm/scientifik") - } - - // Process each publication we have in this project - publications.filterIsInstance().forEach { publication -> - - // use type safe pom config GSL insterad of old dynamic - publication.pom { - name.set(project.name) - description.set(project.description) - url.set(vcs) - - licenses { - license { - name.set("The Apache Software License, Version 2.0") - url.set("http://www.apache.org/licenses/LICENSE-2.0.txt") - distribution.set("repo") - } - } - developers { - developer { - id.set("MIPT-NPM") - name.set("MIPT nuclear physics methods laboratory") - organization.set("MIPT") - organizationUrl.set("http://npm.mipt.ru") - } - - } - scm { - url.set(vcs) - } - } - - } -} - -bintray { - // delegates for runtime properties - val bintrayUser: String? by project - val bintrayApiKey: String? 
by project - user = bintrayUser ?: System.getenv("BINTRAY_USER") - key = bintrayApiKey ?: System.getenv("BINTRAY_API_KEY") - publish = true - override = true // for multi-platform Kotlin/Native publishing - - // We have to use delegateClosureOf because bintray supports only dynamic groovy syntax - // this is a problem of this plugin - pkg(delegateClosureOf { - userOrg = "mipt-npm" - repo = "scientifik" - name = "scientifik.kmath" - issueTrackerUrl = "https://github.com/mipt-npm/kmath/issues" - setLicenses("Apache-2.0") - vcsUrl = vcs - version(delegateClosureOf { - name = project.version.toString() - vcsTag = project.version.toString() - released = java.util.Date().toString() - }) - }) - - tasks { - bintrayUpload { - dependsOn(publishToMavenLocal) - doFirst { - setPublications(project.publishing.publications - .filterIsInstance() - .filter { !it.name.contains("-test") && it.name != "kotlinMultiplatform" } - .map { - println("""Uploading artifact "${it.groupId}:${it.artifactId}:${it.version}" from publication "${it.name}""") - it.name //https://github.com/bintray/gradle-bintray-plugin/issues/256 - }) - } - } - - } -} diff --git a/buildSrc/src/main/kotlin/npm-multiplatform.gradle.kts b/buildSrc/src/main/kotlin/npm-multiplatform.gradle.kts deleted file mode 100644 index 671986b8..00000000 --- a/buildSrc/src/main/kotlin/npm-multiplatform.gradle.kts +++ /dev/null @@ -1,86 +0,0 @@ -import org.gradle.kotlin.dsl.* - -plugins { - kotlin("multiplatform") - `maven-publish` -} - - -kotlin { - jvm { - compilations.all { - kotlinOptions { - jvmTarget = "1.8" - } - } - } - - js { - compilations.all { - kotlinOptions { - metaInfo = true - sourceMap = true - sourceMapEmbedSources = "always" - moduleKind = "commonjs" - } - } - - compilations.named("main") { - kotlinOptions { - main = "call" - } - } - } - - sourceSets { - val commonMain by getting { - dependencies { - api(kotlin("stdlib")) - } - } - val commonTest by getting { - dependencies { - implementation(kotlin("test-common")) 
- implementation(kotlin("test-annotations-common")) - } - } - val jvmMain by getting { - dependencies { - api(kotlin("stdlib-jdk8")) - } - } - val jvmTest by getting { - dependencies { - implementation(kotlin("test")) - implementation(kotlin("test-junit")) - } - } - val jsMain by getting { - dependencies { - api(kotlin("stdlib-js")) - } - } - val jsTest by getting { - dependencies { - implementation(kotlin("test-js")) - } - } - } - - targets.all { - sourceSets.all { - languageSettings.progressiveMode = true - languageSettings.enableLanguageFeature("InlineClasses") - } - } - - apply(plugin = "dokka-publish") - - // Apply JS test configuration - val runJsTests by ext(false) - - if (runJsTests) { - apply(plugin = "js-test") - } - -} diff --git a/dataforge-context/build.gradle.kts b/dataforge-context/build.gradle.kts index f454e2ef..896e7b89 100644 --- a/dataforge-context/build.gradle.kts +++ b/dataforge-context/build.gradle.kts @@ -1,34 +1,31 @@ plugins { - `npm-multiplatform` + id("scientifik.mpp") } description = "Context and provider definitions" -val coroutinesVersion: String = Versions.coroutinesVersion +val coroutinesVersion: String = Scientifik.coroutinesVersion kotlin { - jvm() - js() - sourceSets { val commonMain by getting { dependencies { api(project(":dataforge-meta")) api(kotlin("reflect")) - api("io.github.microutils:kotlin-logging-common:1.6.10") + api("io.github.microutils:kotlin-logging-common:1.7.2") api("org.jetbrains.kotlinx:kotlinx-coroutines-core-common:$coroutinesVersion") } } val jvmMain by getting { dependencies { - api("io.github.microutils:kotlin-logging:1.6.10") + api("io.github.microutils:kotlin-logging:1.7.2") api("ch.qos.logback:logback-classic:1.2.3") api("org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutinesVersion") } } val jsMain by getting { dependencies { - api("io.github.microutils:kotlin-logging-js:1.6.10") + api("io.github.microutils:kotlin-logging-js:1.7.2") 
api("org.jetbrains.kotlinx:kotlinx-coroutines-core-js:$coroutinesVersion") } } diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt index 9a091ad1..c9268790 100644 --- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt +++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt @@ -3,6 +3,7 @@ package hep.dataforge.context import hep.dataforge.meta.EmptyMeta import hep.dataforge.meta.Meta import hep.dataforge.names.Name +import hep.dataforge.names.toName abstract class AbstractPlugin(override val meta: Meta = EmptyMeta) : Plugin { private var _context: Context? = null @@ -18,7 +19,9 @@ abstract class AbstractPlugin(override val meta: Meta = EmptyMeta) : Plugin { this._context = null } - override fun provideTop(target: String, name: Name): Any? = null + override fun provideTop(target: String): Map = emptyMap() - override fun listNames(target: String): Sequence = emptySequence() + companion object{ + fun Collection.toMap(): Map = associate { it.name.toName() to it } + } } \ No newline at end of file diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt index cf6bb662..80746456 100644 --- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt +++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt @@ -26,8 +26,10 @@ import kotlin.jvm.JvmName * Since plugins could contain mutable state, context has two states: active and inactive. No changes are allowed to active context. * @author Alexander Nozik */ -open class Context(final override val name: String, val parent: Context? = Global) : Named, MetaRepr, Provider, - CoroutineScope { +open class Context( + final override val name: String, + val parent: Context? 
= Global +) : Named, MetaRepr, Provider, CoroutineScope { private val config = Config() @@ -59,19 +61,11 @@ open class Context(final override val name: String, val parent: Context? = Globa override val defaultTarget: String get() = Plugin.PLUGIN_TARGET - override fun provideTop(target: String, name: Name): Any? { + override fun provideTop(target: String): Map { return when (target) { - Plugin.PLUGIN_TARGET -> plugins[PluginTag.fromString(name.toString())] - Value.TYPE -> properties[name]?.value - else -> null - } - } - - override fun listNames(target: String): Sequence { - return when (target) { - Plugin.PLUGIN_TARGET -> plugins.asSequence().map { it.name.toName() } - Value.TYPE -> properties.values().map { it.first } - else -> emptySequence() + Value.TYPE -> properties.sequence().toMap() + Plugin.PLUGIN_TARGET -> plugins.sequence(true).associateBy { it.name.toName() } + else -> emptyMap() } } @@ -116,11 +110,11 @@ open class Context(final override val name: String, val parent: Context? = Globa } } -/** - * A sequences of all objects provided by plugins with given target and type - */ fun Context.content(target: String): Map = content(target) +/** + * A map of all objects provided by plugins with given target and type + */ @JvmName("typedContent") inline fun Context.content(target: String): Map = plugins.flatMap { plugin -> @@ -148,14 +142,18 @@ object Global : Context("GLOBAL", null) { private val contextRegistry = HashMap() /** - * Get previously builder context o builder a new one + * Get previously built context * * @param name * @return */ - fun getContext(name: String): Context { - return contextRegistry.getOrPut(name) { Context(name) } + fun getContext(name: String): Context? 
{ + return contextRegistry[name] } + + fun context(name: String, parent: Context = this, block: ContextBuilder.() -> Unit = {}): Context = + ContextBuilder(name, parent).apply(block).build() + } diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt index 203edd2c..92840862 100644 --- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt +++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt @@ -18,11 +18,15 @@ class ContextBuilder(var name: String = "@anonimous", val parent: Context = Glob plugins.add(plugin) } - fun plugin(tag: PluginTag, action: MetaBuilder.() -> Unit) { + fun plugin(tag: PluginTag, action: MetaBuilder.() -> Unit = {}) { plugins.add(PluginRepository.fetch(tag, buildMeta(action))) } - fun plugin(name: String, group: String = "", version: String = "", action: MetaBuilder.() -> Unit) { + fun plugin(builder: PluginFactory<*>, action: MetaBuilder.() -> Unit = {}) { + plugins.add(builder.invoke(buildMeta(action))) + } + + fun plugin(name: String, group: String = "", version: String = "", action: MetaBuilder.() -> Unit = {}) { plugin(PluginTag(name, group, version), action) } diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt index bcce8eb6..dad483a8 100644 --- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt +++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt @@ -51,7 +51,10 @@ class PluginManager(override val context: Context) : ContextAware, Iterable get(type: KClass, recursive: Boolean = true): T? = - get(recursive) { type.isInstance(it) } as T? + operator fun get(type: KClass, tag: PluginTag? = null, recursive: Boolean = true): T? 
= + get(recursive) { type.isInstance(it) && (tag == null || tag.matches(it.tag)) } as T? - inline fun get(recursive: Boolean = true): T? = get(T::class, recursive) + inline fun get(tag: PluginTag? = null, recursive: Boolean = true): T? = + get(T::class, tag, recursive) /** * Load given plugin into this manager and return loaded instance. - * Throw error if plugin of the same class already exists in manager + * Throw error if plugin of the same type and tag already exists in manager. * * @param plugin * @return @@ -75,10 +79,12 @@ class PluginManager(override val context: Context) : ContextAware, Iterable load(plugin: T): T { if (context.isActive) error("Can't load plugin into active context") - if (get(plugin::class, false) != null) { - throw RuntimeException("Plugin of type ${plugin::class} already exists in ${context.name}") + if (get(plugin::class, plugin.tag, recursive = false) != null) { + error("Plugin of type ${plugin::class} already exists in ${context.name}") } else { - loadDependencies(plugin) + for (tag in plugin.dependsOn()) { + fetch(tag, true) + } logger.info { "Loading plugin ${plugin.name} into ${context.name}" } plugin.attach(context) @@ -87,11 +93,14 @@ class PluginManager(override val context: Context) : ContextAware, Iterable load(factory: PluginFactory, meta: Meta = EmptyMeta): T = + load(factory(meta)) + + fun load(factory: PluginFactory, metaBuilder: MetaBuilder.() -> Unit): T = + load(factory, buildMeta(metaBuilder)) /** * Remove a plugin from [PluginManager] @@ -107,22 +116,11 @@ class PluginManager(override val context: Context) : ContextAware, Iterable load(PluginRepository.fetch(tag,meta)) - loaded.meta == meta -> loaded // if meta is the same, return existing plugin - else -> throw RuntimeException("Can't load plugin with tag $tag. 
Plugin with this tag and different configuration already exists in context.") - } - } - - fun load(factory: PluginFactory<*>, meta: Meta = EmptyMeta): Plugin{ - val loaded = get(factory.tag, false) + fun fetch(factory: PluginFactory, recursive: Boolean = true, meta: Meta = EmptyMeta): T { + val loaded = get(factory.type, factory.tag, recursive) return when { loaded == null -> load(factory(meta)) loaded.meta == meta -> loaded // if meta is the same, return existing plugin @@ -130,42 +128,12 @@ class PluginManager(override val context: Context) : ContextAware, Iterable load(type: KClass, meta: Meta = EmptyMeta): T { - val loaded = get(type, false) - return when { - loaded == null -> { - val plugin = PluginRepository.list().first { it.type == type }.invoke(meta) - if (type.isInstance(plugin)) { - @Suppress("UNCHECKED_CAST") - load(plugin as T) - } else { - error("Corrupt type information in plugin repository") - } - } - loaded.meta == meta -> loaded // if meta is the same, return existing plugin - else -> throw RuntimeException("Can't load plugin with type $type. Plugin with this type and different configuration already exists in context.") - } - } - - inline fun load(noinline metaBuilder: MetaBuilder.() -> Unit = {}): T { - return load(T::class, buildMeta(metaBuilder)) - } - - fun load(name: String, meta: Meta = EmptyMeta): Plugin { - return load(PluginTag.fromString(name), meta) - } + fun fetch( + factory: PluginFactory, + recursive: Boolean = true, + metaBuilder: MetaBuilder.() -> Unit + ): T = fetch(factory, recursive, buildMeta(metaBuilder)) override fun iterator(): Iterator = plugins.iterator() - /** - * Get a plugin if it exists or load it with given meta if it is not. 
-     */
-    inline fun <reified T : Plugin> getOrLoad(noinline metaBuilder: MetaBuilder.() -> Unit = {}): T {
-        return get(true) ?: load(metaBuilder)
-    }
-}
\ No newline at end of file
diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt
index 657f272d..b1d769e2 100644
--- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt
+++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt
@@ -17,7 +17,6 @@ package hep.dataforge.provider

 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
-import kotlin.jvm.JvmName

 /**
  * A marker utility interface for providers.
@@ -42,29 +41,21 @@ interface Provider {

     /**
-     * Provide a top level element for this [Provider] or return null if element is not present
+     * A map of direct children for specific target
      */
-    fun provideTop(target: String, name: Name): Any?
-
-    /**
-     * [Sequence] of available names with given target. Only top level names are listed, no chain path.
-     *
-     * @param target
-     * @return
-     */
-    fun listNames(target: String): Sequence<Name>
+    fun provideTop(target: String): Map<Name, Any>
 }

 fun Provider.provide(path: Path, targetOverride: String? = null): Any? {
     if (path.length == 0) throw IllegalArgumentException("Can't provide by empty path")
     val first = path.first()
-    val top = provideTop(targetOverride ?: first.target ?: defaultTarget, first.name)
+    val target = targetOverride ?: first.target ?: defaultTarget
+    val res = provideTop(target)[first.name] ?: return null
     return when (path.length) {
-        1 -> top
+        1 -> res
         else -> {
-            when (top) {
-                null -> null
-                is Provider -> top.provide(path.tail!!, targetOverride = defaultChainTarget)
+            when (res) {
+                is Provider -> res.provide(path.tail!!, targetOverride = defaultChainTarget)
                 else -> throw IllegalStateException("Chain path not supported: child is not a provider")
             }
         }
@@ -86,14 +77,11 @@ inline fun <reified T : Any> Provider.provide(target: String, name: String): T? =
     provide(target, name.toName())

 /**
- * A top level content with names
+ * Typed top level content
  */
-fun Provider.top(target: String): Map<Name, Any> = top(target)
-
-@JvmName("typedTop")
 inline fun <reified T : Any> Provider.top(target: String): Map<Name, T> {
-    return listNames(target).associate {
-        it to (provideTop(target, it) as? T ?: error("The element $it is declared but not provided"))
+    return provideTop(target).mapValues {
+        it.value as? T ?: error("The type of element $it is ${it::class} but ${T::class} is expected")
     }
 }

diff --git a/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt b/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt
index 1d37c69e..c77439d6 100644
--- a/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt
+++ b/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt
@@ -12,17 +12,10 @@ class ContextTest {
     class DummyPlugin : AbstractPlugin() {
         override val tag get() = PluginTag("test")

-        override fun provideTop(target: String, name: Name): Any? {
-            return when (target) {
-                "test" -> return name
-                else -> super.provideTop(target, name)
-            }
-        }
-
-        override fun listNames(target: String): Sequence<Name> {
-            return when (target) {
-                "test" -> sequenceOf("a", "b", "c.d").map { it.toName() }
-                else -> super.listNames(target)
+        override fun provideTop(target: String): Map<Name, Any> {
+            return when(target){
+                "test" -> listOf("a", "b", "c.d").associate { it.toName() to it.toName() }
+                else -> emptyMap()
             }
         }
     }
diff --git a/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt b/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt
index d6ed723d..dfe81ce0 100644
--- a/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt
+++ b/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt
@@ -34,9 +34,7 @@ inline fun <reified T : Any> Provider.provideByType(name: Name): T? {

 inline fun <reified T : Any> Provider.top(): Map<Name, T> {
     val target = Types[T::class]
-    return listNames(target).associate { name ->
-        name to (provideByType(name) ?: error("The element $name is declared but not provided"))
-    }
+    return top(target)
 }

 /**
diff --git a/dataforge-data/build.gradle.kts b/dataforge-data/build.gradle.kts
index 7ebb46ce..793f551b 100644
--- a/dataforge-data/build.gradle.kts
+++ b/dataforge-data/build.gradle.kts
@@ -1,17 +1,14 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }

-val coroutinesVersion: String = Versions.coroutinesVersion
+val coroutinesVersion: String = Scientifik.coroutinesVersion

 kotlin {
-    jvm()
-    js()
     sourceSets {
         val commonMain by getting{
             dependencies {
                 api(project(":dataforge-meta"))
-                api(kotlin("reflect"))
                 api("org.jetbrains.kotlinx:kotlinx-coroutines-core-common:$coroutinesVersion")
             }
         }
@@ -19,6 +16,7 @@ kotlin {
         val jvmMain by getting{
             dependencies {
                 api("org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutinesVersion")
+                api(kotlin("reflect"))
             }
         }

diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt
index 228522dc..cf030c75 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt
@@ -1,7 +1,6 @@
 package hep.dataforge.data

 import hep.dataforge.meta.Meta
-import hep.dataforge.names.Name

 /**
  * A simple data transformation on a data node
@@ -34,19 +33,3 @@ infix fun <T : Any, I : Any, R : Any> Action<T, I>.then(action: Action<I, R>): Action<T, R> {
     }
 }
-
-///**
-// * An action that performs the same transformation on each of input data nodes. Null results are ignored.
-// * The transformation is non-suspending because it is lazy.
-// */
-//class PipeAction<T : Any, R : Any>(val transform: (Name, Data<T>, Meta) -> Data<R>?) : Action<T, R> {
-//    override fun invoke(node: DataNode<T>, meta: Meta): DataNode<R> = DataNode.build {
-//        node.data().forEach { (name, data) ->
-//            val res = transform(name, data, meta)
-//            if (res != null) {
-//                set(name, res)
-//            }
-//        }
-//    }
-//}
-
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/CoroutineMonitor.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/CoroutineMonitor.kt
new file mode 100644
index 00000000..8cbd6192
--- /dev/null
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/CoroutineMonitor.kt
@@ -0,0 +1,48 @@
+package hep.dataforge.data
+
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Job
+import kotlin.coroutines.CoroutineContext
+
+/**
+ * A monitor of goal state that could be accessed only form inside the goal
+ */
+class CoroutineMonitor : CoroutineContext.Element {
+    override val key: CoroutineContext.Key<*> get() = CoroutineMonitor
+
+    var totalWork: Double = 1.0
+    var workDone: Double = 0.0
+    var status: String = ""
+
+    /**
+     * Mark the goal as started
+     */
+    fun start() {
+
+    }
+
+    /**
+     * Mark the goal as completed
+     */
+    fun finish() {
+        workDone = totalWork
+    }
+
+    companion object : CoroutineContext.Key<CoroutineMonitor>
+}
+
+class Dependencies(val values: Collection<Job>) : CoroutineContext.Element {
+    override val key: CoroutineContext.Key<*> get() = Dependencies
+
+    companion object : CoroutineContext.Key<Dependencies>
+}
+
+val CoroutineContext.monitor: CoroutineMonitor? get() = this[CoroutineMonitor]
+val CoroutineScope.monitor: CoroutineMonitor? get() = coroutineContext.monitor
+
+val Job.dependencies: Collection<Job> get() = this[Dependencies]?.values ?: emptyList()
+
+val Job.totalWork: Double get() = dependencies.sumByDouble { totalWork } + (monitor?.totalWork ?: 0.0)
+val Job.workDone: Double get() = dependencies.sumByDouble { workDone } + (monitor?.workDone ?: 0.0)
+val Job.status: String get() = monitor?.status ?: ""
+val Job.progress: Double get() = workDone / totalWork
\ No newline at end of file
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt
index 20957824..9b0d9027 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt
@@ -1,14 +1,17 @@
 package hep.dataforge.data

+import hep.dataforge.meta.EmptyMeta
 import hep.dataforge.meta.Meta
 import hep.dataforge.meta.MetaRepr
 import kotlinx.coroutines.CoroutineScope
+import kotlin.coroutines.CoroutineContext
+import kotlin.coroutines.EmptyCoroutineContext
 import kotlin.reflect.KClass

 /**
  * A data element characterized by its meta
  */
-interface Data<out T : Any> : MetaRepr {
+interface Data<out T : Any> : Goal<T>, MetaRepr {
     /**
      * Type marker for the data. The type is known before the calculation takes place so it could be checked.
      */
@@ -18,52 +21,148 @@ interface Data<out T : Any> : MetaRepr {
      */
     val meta: Meta

-    /**
-     * Lazy data value
-     */
-    val goal: Goal<T>
-
     override fun toMeta(): Meta = meta

     companion object {
         const val TYPE = "data"

-        fun <T : Any> of(type: KClass<out T>, goal: Goal<T>, meta: Meta): Data<T> = DataImpl(type, goal, meta)
-
-        inline fun <reified T : Any> of(goal: Goal<T>, meta: Meta): Data<T> = of(T::class, goal, meta)
-
-        fun <T : Any> of(name: String, type: KClass<out T>, goal: Goal<T>, meta: Meta): Data<T> =
-            NamedData(name, of(type, goal, meta))
+        operator fun <T : Any> invoke(
+            type: KClass<out T>,
+            meta: Meta = EmptyMeta,
+            context: CoroutineContext = EmptyCoroutineContext,
+            dependencies: Collection<Data<*>> = emptyList(),
+            block: suspend CoroutineScope.() -> T
+        ): Data<T> = DynamicData(type, meta, context, dependencies, block)
+
+        operator inline fun <reified T : Any> invoke(
+            meta: Meta = EmptyMeta,
+            context: CoroutineContext = EmptyCoroutineContext,
+            dependencies: Collection<Data<*>> = emptyList(),
+            noinline block: suspend CoroutineScope.() -> T
+        ): Data<T> = invoke(T::class, meta, context, dependencies, block)
+
+        operator fun <T : Any> invoke(
+            name: String,
+            type: KClass<out T>,
+            meta: Meta = EmptyMeta,
+            context: CoroutineContext = EmptyCoroutineContext,
+            dependencies: Collection<Data<*>> = emptyList(),
+            block: suspend CoroutineScope.() -> T
+        ): Data<T> = NamedData(name, invoke(type, meta, context, dependencies, block))
+
+        operator inline fun <reified T : Any> invoke(
+            name: String,
+            meta: Meta = EmptyMeta,
+            context: CoroutineContext = EmptyCoroutineContext,
+            dependencies: Collection<Data<*>> = emptyList(),
+            noinline block: suspend CoroutineScope.() -> T
+        ): Data<T> =
+            invoke(name, T::class, meta, context, dependencies, block)
+
+        fun <T : Any> static(value: T, meta: Meta = EmptyMeta): Data<T> =
+            StaticData(value, meta)
+    }
+}

-        inline fun <reified T : Any> of(name: String, goal: Goal<T>, meta: Meta): Data<T> =
-            of(name, T::class, goal, meta)

-        fun <T : Any> static(scope: CoroutineScope, value: T, meta: Meta): Data<T> =
-            DataImpl(value::class, Goal.static(scope, value), meta)
+fun <R : Any, T : R> Data<T>.cast(type: KClass<R>): Data<R> {
+    return object : Data<R> by this {
+        override val type: KClass<out R> = type
     }
 }

 /**
  * Upcast a [Data] to a supertype
  */
-inline fun <T : R, reified R : Any> Data<T>.cast(): Data<R> {
-    return Data.of(R::class, goal, meta)
+inline fun <T : R, reified R : Any> Data<T>.cast(): Data<R> = cast(R::class)
+
+
+class DynamicData<T : Any>(
+    override val type: KClass<out T>,
+    override val meta: Meta = EmptyMeta,
+    context: CoroutineContext = EmptyCoroutineContext,
+    dependencies: Collection<Data<*>> = emptyList(),
+    block: suspend CoroutineScope.() -> T
+) : Data<T>, DynamicGoal<T>(context, dependencies, block)
+
+class StaticData<T : Any>(
+    value: T,
+    override val meta: Meta = EmptyMeta
+) : Data<T>, StaticGoal<T>(value) {
+    override val type: KClass<out T> get() = value::class
 }

-fun <R : Any, T : R> Data<T>.cast(type: KClass<R>): Data<R> {
-    return Data.of(type, goal, meta)
+class NamedData<out T : Any>(val name: String, data: Data<T>) : Data<T> by data
+
+fun <T : Any, R : Any> Data<T>.pipe(
+    outputType: KClass<out R>,
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
+    meta: Meta = this.meta,
+    block: suspend CoroutineScope.(T) -> R
+): Data<R> = DynamicData(outputType, meta, coroutineContext, listOf(this)) {
+    block(await(this))
 }

-suspend fun <T : Any> Data<T>.await(): T = goal.await()

 /**
- * Generic Data implementation
+ * Create a data pipe
  */
-private class DataImpl<out T : Any>(
-    override val type: KClass<out T>,
-    override val goal: Goal<T>,
-    override val meta: Meta
-) : Data<T>
+inline fun <T : Any, reified R : Any> Data<T>.pipe(
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
+    meta: Meta = this.meta,
+    noinline block: suspend CoroutineScope.(T) -> R
+): Data<R> = DynamicData(R::class, meta, coroutineContext, listOf(this)) {
+    block(await(this))
+}
+
+/**
+ * Create a joined data.
+ */
+inline fun <T : Any, reified R : Any> Collection<Data<T>>.join(
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
+    meta: Meta,
+    noinline block: suspend CoroutineScope.(Collection<T>) -> R
+): Data<R> = DynamicData(
+    R::class,
+    meta,
+    coroutineContext,
+    this
+) {
+    block(map { this.run { it.await(this) } })
+}
+
+fun <K, T : Any, R : Any> Map<K, Data<T>>.join(
+    outputType: KClass<out R>,
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
+    meta: Meta,
+    block: suspend CoroutineScope.(Map<K, T>) -> R
+): DynamicData<R> = DynamicData(
+    outputType,
+    meta,
+    coroutineContext,
+    this.values
+) {
+    block(mapValues { it.value.await(this) })
+}
+
+
+/**
+ * A joining of multiple data into a single one
+ * @param K type of the map key
+ * @param T type of the input goal
+ * @param R type of the result goal
+ */
+inline fun <K, T : Any, reified R : Any> Map<K, Data<T>>.join(
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
+    meta: Meta,
+    noinline block: suspend CoroutineScope.(Map<K, T>) -> R
+): DynamicData<R> = DynamicData(
+    R::class,
+    meta,
+    coroutineContext,
+    this.values
+) {
+    block(mapValues { it.value.await(this) })
+}

-class NamedData<out T : Any>(val name: String, data: Data<T>) : Data<T> by data
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt
index 6c920f57..a23b550d 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt
@@ -20,10 +20,10 @@ class DataFilter(override val config: Config) : Specific {
  * Apply meta-based filter to given data node
  */
 fun <T : Any> DataNode<T>.filter(filter: DataFilter): DataNode<T> {
-    val sourceNode = filter.from?.let { getNode(it.toName()) } ?: this@filter
+    val sourceNode = filter.from?.let { get(it.toName()).node } ?: this@filter
     val regex = filter.pattern.toRegex()
     val targetNode = DataTreeBuilder(type).apply {
-        sourceNode.data().forEach { (name, data) ->
+        sourceNode.dataSequence().forEach { (name, data) ->
             if (name.toString().matches(regex)) {
                 this[name] = data
             }
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt
index 02fc6a9e..a407b512 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt
@@ -1,8 +1,26 @@
 package hep.dataforge.data

 import hep.dataforge.names.*
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Job
+import kotlinx.coroutines.launch
+import kotlin.collections.component1
+import kotlin.collections.component2
+import kotlin.collections.set
 import kotlin.reflect.KClass

+sealed class DataItem<out T : Any> {
+    abstract val type: KClass<out T>
+
+    class Node<out T : Any>(val value: DataNode<T>) : DataItem<T>() {
+        override val type: KClass<out T> get() = value.type
+    }
+
+    class Leaf<out T : Any>(val value: Data<T>) : DataItem<T>() {
+        override val type: KClass<out T> get() = value.type
+    }
+}
+
 /**
  * A tree-like data structure grouped into the node. All data inside the node must inherit its type
  */
@@ -13,93 +31,89 @@ interface DataNode<out T : Any> {
      */
     val type: KClass<out T>

-    /**
-     * Get the specific data if it exists
-     */
-    operator fun get(name: Name): Data<T>?
-
-    /**
-     * Get a subnode with given name if it exists.
-     */
-    fun getNode(name: Name): DataNode<T>?
-
-    /**
-     * Walk the tree upside down and provide all data nodes with full names
-     */
-    fun data(): Sequence<Pair<Name, Data<T>>>
-
-    /**
-     * A sequence of all nodes in the tree walking upside down, excluding self
-     */
-    fun nodes(): Sequence<Pair<Name, DataNode<T>>>
-
-    operator fun iterator(): Iterator<Pair<Name, Data<T>>> = data().iterator()
+    val items: Map<NameToken, DataItem<T>>

     companion object {
         const val TYPE = "dataNode"

         fun <T : Any> build(type: KClass<out T>, block: DataTreeBuilder<T>.() -> Unit) =
-                DataTreeBuilder(type).apply(block).build()
+            DataTreeBuilder(type).apply(block).build()

         fun <T : Any> builder(type: KClass<out T>) = DataTreeBuilder(type)
     }
 }

-internal sealed class DataTreeItem<out T : Any> {
-    class Node<out T : Any>(val tree: DataTree<T>) : DataTreeItem<T>()
-    class Value<out T : Any>(val value: Data<T>) : DataTreeItem<T>()
+val <T : Any> DataItem<T>?.node: DataNode<T>? get() = (this as? DataItem.Node)?.value
+val <T : Any> DataItem<T>?.data: Data<T>? get() = (this as? DataItem.Leaf)?.value
+
+/**
+ * Start computation for all goals in data node
+ */
+fun DataNode<*>.startAll(scope: CoroutineScope): Unit = items.values.forEach {
+    when (it) {
+        is DataItem.Node<*> -> it.value.startAll(scope)
+        is DataItem.Leaf<*> -> it.value.start(scope)
+    }
 }

-class DataTree<out T : Any> internal constructor(
-    override val type: KClass<out T>,
-    private val items: Map<NameToken, DataTreeItem<T>>
-) : DataNode<T> {
-    //TODO add node-level meta?
+fun DataNode<*>.joinAll(scope: CoroutineScope): Job = scope.launch {
+    startAll(scope)
+    items.forEach {
+        when (val value = it.value) {
+            is DataItem.Node -> value.value.joinAll(this).join()
+            is DataItem.Leaf -> value.value.await(scope)
+        }
+    }
+}

-    override fun get(name: Name): Data<T>? = when (name.length) {
-        0 -> error("Empty name")
-        1 -> (items[name.first()] as? DataTreeItem.Value)?.value
-        else -> getNode(name.first()!!.asName())?.get(name.cutFirst())
-    }
-
-    override fun getNode(name: Name): DataTree<T>? = when (name.length) {
-        0 -> this
-        1 -> (items[name.first()] as? DataTreeItem.Node)?.tree
-        else -> getNode(name.first()!!.asName())?.getNode(name.cutFirst())
-    }
-
-    override fun data(): Sequence<Pair<Name, Data<T>>> {
-        return sequence {
-            items.forEach { (head, tree) ->
-                when (tree) {
-                    is DataTreeItem.Value -> yield(head.asName() to tree.value)
-                    is DataTreeItem.Node -> {
-                        val subSequence =
-                            tree.tree.data().map { (name, data) -> (head.asName() + name) to data }
-                        yieldAll(subSequence)
-                    }
-                }
-            }
+operator fun <T : Any> DataNode<T>.get(name: Name): DataItem<T>? = when (name.length) {
+    0 -> error("Empty name")
+    1 -> (items[name.first()] as? DataItem.Leaf)
+    else -> get(name.first()!!.asName()).node?.get(name.cutFirst())
+}
+
+/**
+ * Sequence of all children including nodes
+ */
+fun <T : Any> DataNode<T>.asSequence(): Sequence<Pair<Name, DataItem<T>>> = sequence {
+    items.forEach { (head, item) ->
+        yield(head.asName() to item)
+        if (item is DataItem.Node) {
+            val subSequence = item.value.asSequence()
+                .map { (name, data) -> (head.asName() + name) to data }
+            yieldAll(subSequence)
         }
     }
+}

-    override fun nodes(): Sequence<Pair<Name, DataNode<T>>> {
-        return sequence {
-            items.forEach { (head, tree) ->
-                if (tree is DataTreeItem.Node) {
-                    yield(head.asName() to tree.tree)
-                    val subSequence =
-                        tree.tree.nodes().map { (name, node) -> (head.asName() + name) to node }
-                    yieldAll(subSequence)
-                }
+/**
+ * Sequence of data entries
+ */
+fun <T : Any> DataNode<T>.dataSequence(): Sequence<Pair<Name, Data<T>>> = sequence {
+    items.forEach { (head, item) ->
+        when (item) {
+            is DataItem.Leaf -> yield(head.asName() to item.value)
+            is DataItem.Node -> {
+                val subSequence = item.value.dataSequence()
+                    .map { (name, data) -> (head.asName() + name) to data }
+                yieldAll(subSequence)
             }
         }
     }
 }

+operator fun <T : Any> DataNode<T>.iterator(): Iterator<Pair<Name, DataItem<T>>> = asSequence().iterator()
+
+class DataTree<out T : Any> internal constructor(
+    override val type: KClass<out T>,
+    override val items: Map<NameToken, DataItem<T>>
+) : DataNode<T> {
+    //TODO add node-level meta?
+}
+
 private sealed class DataTreeBuilderItem<out T : Any> {
     class Node<T : Any>(val tree: DataTreeBuilder<T>) : DataTreeBuilderItem<T>()
-    class Value<T : Any>(val value: Data<T>) : DataTreeBuilderItem<T>()
+    class Leaf<T : Any>(val value: Data<T>) : DataTreeBuilderItem<T>()
 }

 /**
@@ -115,7 +129,7 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {

     operator fun set(token: NameToken, data: Data<T>) {
         if (map.containsKey(token)) error("Tree entry with name $token is not empty")
-        map[token] = DataTreeBuilderItem.Value(data)
+        map[token] = DataTreeBuilderItem.Leaf(data)
     }

     private fun buildNode(token: NameToken): DataTreeBuilder<T> {
@@ -152,6 +166,11 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {

     operator fun set(name: Name, node: DataNode<T>) = set(name, node.builder())

+    operator fun set(name: Name, item: DataItem<T>) = when (item) {
+        is DataItem.Node -> set(name, item.value.builder())
+        is DataItem.Leaf -> set(name, item.value)
+    }
+
     /**
      * Append data to node
      */
@@ -162,14 +181,16 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
      */
     infix fun String.to(node: DataNode<T>) = set(toName(), node)

+    infix fun String.to(item: DataItem<T>) = set(toName(), item)
+
     /**
      * Build and append node
      */
     infix fun String.to(block: DataTreeBuilder<T>.() -> Unit) = set(toName(), DataTreeBuilder(type).apply(block))

-    fun update(node: DataNode<T>){
-        node.data().forEach {
+    fun update(node: DataNode<T>) {
+        node.dataSequence().forEach {
             //TODO check if the place is occupied
             this[it.first] = it.second
         }
@@ -178,8 +199,8 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
     fun build(): DataTree<T> {
         val resMap = map.mapValues { (_, value) ->
             when (value) {
-                is DataTreeBuilderItem.Value -> DataTreeItem.Value(value.value)
-                is DataTreeBuilderItem.Node -> DataTreeItem.Node(value.tree.build())
+                is DataTreeBuilderItem.Leaf -> DataItem.Leaf(value.value)
+                is DataTreeBuilderItem.Node -> DataItem.Node(value.tree.build())
             }
         }
         return DataTree(type, resMap)
@@ -190,27 +211,20 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
 /**
  * Generate a mutable builder from this node. Node content is not changed
  */
 fun <T : Any> DataNode<T>.builder(): DataTreeBuilder<T> = DataTreeBuilder(type).apply {
-    data().forEach { (name, data) -> this[name] = data }
+    dataSequence().forEach { (name, data) -> this[name] = data }
 }

-/**
- * Start computation for all goals in data node
- */
-fun DataNode<*>.startAll() = data().forEach { (_, data) -> data.goal.start() }
-
 fun <T : Any> DataNode<T>.filter(predicate: (Name, Data<T>) -> Boolean): DataNode<T> = DataNode.build(type) {
-    data().forEach { (name, data) ->
+    dataSequence().forEach { (name, data) ->
         if (predicate(name, data)) {
             this[name] = data
         }
     }
 }

-fun <T : Any> DataNode<T>.first(): Data<T> = data().first().second
+fun <T : Any> DataNode<T>.first(): Data<T>? = dataSequence().first().second

 /**
  * Check that node is compatible with given type meaning that each element could be cast to the type
  */
-expect fun DataNode<*>.checkType(type: KClass<*>)
-
-//fun DataNode.filterIsInstance(type: KClass): DataNode = filter{_,data -> type.}
\ No newline at end of file
+expect fun DataNode<*>.checkType(type: KClass<*>)
\ No newline at end of file
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Goal.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Goal.kt
index 991ddbdb..54bb743e 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Goal.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Goal.kt
@@ -4,116 +4,102 @@ import kotlinx.coroutines.*
 import kotlin.coroutines.CoroutineContext
 import kotlin.coroutines.EmptyCoroutineContext

-/**
- * A special deferred with explicit dependencies and some additional information like progress and unique id
- */
-interface Goal<out T> : Deferred<T>, CoroutineScope {
-    val scope: CoroutineScope
-    override val coroutineContext get() = scope.coroutineContext
-
+interface Goal<out T> {
     val dependencies: Collection<Goal<*>>

-    val totalWork: Double get() = dependencies.sumByDouble { totalWork } + (monitor?.totalWork ?: 0.0)
-    val workDone: Double get() = dependencies.sumByDouble { workDone } + (monitor?.workDone ?: 0.0)
-    val status: String get() = monitor?.status ?: ""
-    val progress: Double get() = workDone / totalWork
-
-    companion object {
-        /**
-         * Create goal wrapping static value. This goal is always completed
-         */
-        fun <T> static(scope: CoroutineScope, value: T): Goal<T> =
-            StaticGoalImpl(scope, CompletableDeferred(value))
-    }
-}
-
-/**
- * A monitor of goal state that could be accessed only form inside the goal
- */
-class GoalMonitor : CoroutineContext.Element {
-    override val key: CoroutineContext.Key<*> get() = GoalMonitor
-
-    var totalWork: Double = 1.0
-    var workDone: Double = 0.0
-    var status: String = ""
+    /**
+     * Returns current running coroutine if the goal is started
+     */
+    val result: Deferred<T>?

     /**
-     * Mark the goal as started
+     * Get ongoing computation or start a new one.
+     * Does not guarantee thread safety. In case of multi-thread access, could create orphan computations.
      */
-    fun start() {
+    fun startAsync(scope: CoroutineScope): Deferred<T>

-    }
+    suspend fun CoroutineScope.await(): T = startAsync(this).await()

     /**
-     * Mark the goal as completed
+     * Reset the computation
      */
-    fun finish() {
-        workDone = totalWork
-    }
+    fun reset()
+
+    companion object {

-    companion object : CoroutineContext.Key<GoalMonitor>
+    }
 }

-val CoroutineScope.monitor: GoalMonitor? get() = coroutineContext[GoalMonitor]
+fun Goal<*>.start(scope: CoroutineScope): Job = startAsync(scope)
+
+val Goal<*>.isComplete get() = result?.isCompleted ?: false

-private class GoalImpl<out T>(
-    override val scope: CoroutineScope,
-    override val dependencies: Collection<Goal<*>>,
-    deferred: Deferred<T>
-) : Goal<T>, Deferred<T> by deferred
+suspend fun <T> Goal<T>.await(scope: CoroutineScope): T = scope.await()

-private class StaticGoalImpl<T>(override val scope: CoroutineScope, deferred: CompletableDeferred<T>) : Goal<T>,
-    Deferred<T> by deferred {
+open class StaticGoal<T>(val value: T) : Goal<T> {
     override val dependencies: Collection<Goal<*>> get() = emptyList()
-    override val status: String get() = ""
-    override val totalWork: Double get() = 0.0
-    override val workDone: Double get() = 0.0
+    override val result: Deferred<T> = CompletableDeferred(value)
+
+    override fun startAsync(scope: CoroutineScope): Deferred<T> = result
+
+    override fun reset() {
+        //doNothing
+    }
 }

+open class DynamicGoal<T>(
+    val coroutineContext: CoroutineContext = EmptyCoroutineContext,
+    override val dependencies: Collection<Goal<*>> = emptyList(),
+    val block: suspend CoroutineScope.() -> T
+) : Goal<T> {

-/**
- * Create a new [Goal] with given [dependencies] and execution [block]. The block takes monitor as parameter.
- * The goal block runs in a supervised scope, meaning that when it fails, it won't affect external scope.
- *
- * **Important:** Unlike regular deferred, the [Goal] is started lazily, so the actual calculation is called only when result is requested.
- */
-fun <R> CoroutineScope.createGoal(
-    dependencies: Collection<Goal<*>>,
-    context: CoroutineContext = EmptyCoroutineContext,
-    block: suspend CoroutineScope.() -> R
-): Goal<R> {
-    val deferred = async(context + GoalMonitor(), start = CoroutineStart.LAZY) {
-        dependencies.forEach { it.start() }
-        monitor?.start()
-        //Running in supervisor scope in order to allow manual error handling
-        return@async supervisorScope {
-            block().also {
-                monitor?.finish()
-            }
+    final override var result: Deferred<T>? = null
+        private set
+
+    /**
+     * Get ongoing computation or start a new one.
+     * Does not guarantee thread safety. In case of multi-thread access, could create orphan computations.
+     */
+    override fun startAsync(scope: CoroutineScope): Deferred<T> {
+        val startedDependencies = this.dependencies.map { goal ->
+            goal.startAsync(scope)
         }
+        return result ?: scope.async(coroutineContext + CoroutineMonitor() + Dependencies(startedDependencies)) {
+            startedDependencies.forEach { deferred ->
+                deferred.invokeOnCompletion { error ->
+                    if (error != null) cancel(CancellationException("Dependency $deferred failed with error: ${error.message}"))
+                }
+            }
+            block()
+        }.also { result = it }
     }
-    return GoalImpl(this, dependencies, deferred)
+
+    /**
+     * Reset the computation
+     */
+    override fun reset() {
+        result?.cancel()
+        result = null
+    }
 }

 /**
  * Create a one-to-one goal based on existing goal
  */
 fun <T, R> Goal<T>.pipe(
-    context: CoroutineContext = EmptyCoroutineContext,
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
     block: suspend CoroutineScope.(T) -> R
-): Goal<R> = createGoal(listOf(this), context) { block(await()) }
+): Goal<R> = DynamicGoal(coroutineContext, listOf(this)) {
+    block(await(this))
+}

 /**
  * Create a joining goal.
- * @param scope the scope for resulting goal. By default use first goal in list
  */
 fun <T, R> Collection<Goal<T>>.join(
-    scope: CoroutineScope = first(),
-    context: CoroutineContext = EmptyCoroutineContext,
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
     block: suspend CoroutineScope.(Collection<T>) -> R
-): Goal<R> = scope.createGoal(this, context) {
-    block(map { it.await() })
+): Goal<R> = DynamicGoal(coroutineContext, this) {
+    block(map { this.run { it.await(this) } })
 }

 /**
@@ -123,9 +109,9 @@ fun <T, R> Collection<Goal<T>>.join(
  * @param R type of the result goal
  */
 fun <K, T, R> Map<K, Goal<T>>.join(
-    scope: CoroutineScope = values.first(),
-    context: CoroutineContext = EmptyCoroutineContext,
+    coroutineContext: CoroutineContext = EmptyCoroutineContext,
     block: suspend CoroutineScope.(Map<K, T>) -> R
-): Goal<R> = scope.createGoal(this.values, context) {
-    block(mapValues { it.value.await() })
-}
\ No newline at end of file
+): Goal<R> = DynamicGoal(coroutineContext, this.values) {
+    block(mapValues { it.value.await(this) })
+}
+
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/GroupBuilder.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/GroupBuilder.kt
deleted file mode 100644
index 0820c162..00000000
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/GroupBuilder.kt
+++ /dev/null
@@ -1,75 +0,0 @@
-/*
- * Copyright 2015 Alexander Nozik.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package hep.dataforge.data
-
-import hep.dataforge.meta.Meta
-import hep.dataforge.meta.get
-import hep.dataforge.meta.string
-
-interface GroupRule {
-    operator fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>>
-}
-
-/**
- * The class to builder groups of content with annotation defined rules
- *
- * @author Alexander Nozik
- */
-
-object GroupBuilder {
-
-    /**
-     * Create grouping rule that creates groups for different values of value
-     * field with name [key]
-     *
-     * @param key
-     * @param defaultTagValue
-     * @return
-     */
-    fun byValue(key: String, defaultTagValue: String): GroupRule = object :
-        GroupRule {
-        override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> {
-            val map = HashMap<String, DataTreeBuilder<T>>()
-
-            node.data().forEach { (name, data) ->
-                val tagValue = data.meta[key]?.string ?: defaultTagValue
-                map.getOrPut(tagValue) { DataNode.builder(node.type) }[name] = data
-            }
-
-            return map.mapValues { it.value.build() }
-        }
-    }
-
-
-    //    @ValueDef(key = "byValue", required = true, info = "The name of annotation value by which grouping should be made")
-//    @ValueDef(
-//        key = "defaultValue",
-//        def = "default",
-//        info = "Default value which should be used for content in which the grouping value is not presented"
-//    )
-    fun byMeta(config: Meta): GroupRule {
-        //TODO expand grouping options
-        return config["byValue"]?.string?.let {
-            byValue(
-                it,
-                config["defaultValue"]?.string ?: "default"
-            )
-        }
-            ?: object : GroupRule {
-                override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> = mapOf("" to node)
-            }
-    }
-}
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/GroupRule.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/GroupRule.kt
new file mode 100644
index 00000000..5cfc55e8
--- /dev/null
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/GroupRule.kt
@@ -0,0 +1,68 @@
+/*
+ * Copyright 2015 Alexander Nozik.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package hep.dataforge.data
+
+import hep.dataforge.meta.Meta
+import hep.dataforge.meta.get
+import hep.dataforge.meta.string
+
+interface GroupRule {
+    operator fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>>
+
+    companion object{
+        /**
+         * Create grouping rule that creates groups for different values of value
+         * field with name [key]
+         *
+         * @param key
+         * @param defaultTagValue
+         * @return
+         */
+        fun byValue(key: String, defaultTagValue: String): GroupRule = object :
+            GroupRule {
+            override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> {
+                val map = HashMap<String, DataTreeBuilder<T>>()
+
+                node.dataSequence().forEach { (name, data) ->
+                    val tagValue = data.meta[key]?.string ?: defaultTagValue
+                    map.getOrPut(tagValue) { DataNode.builder(node.type) }[name] = data
+                }
+
+                return map.mapValues { it.value.build() }
+            }
+        }
+
+
+        //    @ValueDef(key = "byValue", required = true, info = "The name of annotation value by which grouping should be made")
+//    @ValueDef(
+//        key = "defaultValue",
+//        def = "default",
+//        info = "Default value which should be used for content in which the grouping value is not presented"
+//    )
+        fun byMeta(config: Meta): GroupRule {
+            //TODO expand grouping options
+            return config["byValue"]?.string?.let {
+                byValue(
+                    it,
+                    config["defaultValue"]?.string ?: "default"
+                )
+            }
+                ?: object : GroupRule {
+                    override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> = mapOf("" to node)
+                }
+        }
+    }
+}
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/JoinAction.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/JoinAction.kt
index 2f5979fe..4acae87f 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/JoinAction.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/JoinAction.kt
@@ -6,8 +6,6 @@ import hep.dataforge.meta.MetaBuilder
 import hep.dataforge.meta.builder
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
-import kotlin.coroutines.CoroutineContext
-import kotlin.coroutines.EmptyCoroutineContext
 import kotlin.reflect.KClass

@@ -31,7 +29,7 @@ class JoinGroupBuilder<T : Any, R : Any>(val actionMeta: Meta) {
      */
     fun byValue(tag: String, defaultTag: String = "@default", action: JoinGroup<T, R>.() -> Unit) {
         groupRules += { node ->
-            GroupBuilder.byValue(tag, defaultTag).invoke(node).map {
+            GroupRule.byValue(tag, defaultTag).invoke(node).map {
                 JoinGroup<T, R>(it.key, it.value).apply(action)
             }
         }
@@ -78,7 +76,6 @@ class JoinGroupBuilder<T : Any, R : Any>(val actionMeta: Meta) {
 class JoinAction<T : Any, R : Any>(
     val inputType: KClass<T>,
     val outputType: KClass<R>,
-    val context: CoroutineContext = EmptyCoroutineContext,
     private val action: JoinGroupBuilder<T, R>.() -> Unit
 ) : Action<T, R> {

@@ -89,17 +86,13 @@ class JoinAction<T : Any, R : Any>(

             val laminate = Laminate(group.meta, meta)

-            val goalMap: Map<Name, Goal<T>> = group.node
-                .data()
-                .associate { it.first to it.second.goal }
+            val dataMap = group.node.dataSequence().associate { it }

             val groupName: String = group.name;

             val env = ActionEnv(groupName.toName(), laminate.builder())

-            val goal = goalMap.join(context = context) { group.result.invoke(env, it) }
-
-            val res = Data.of(outputType, goal, env.meta)
+            val res: DynamicData<R> = dataMap.join(outputType, meta = laminate) { group.result.invoke(env, it) }

             set(env.name, res)
         }
@@ -108,4 +101,4 @@ class JoinAction<T : Any, R : Any>(
     }
 }

-operator fun Map<Name, *>.get(name:String) = get(name.toName())
+operator fun Map<Name, *>.get(name: String) = get(name.toName())
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/PipeAction.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/PipeAction.kt
index f106df40..c84e5a13 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/PipeAction.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/PipeAction.kt
@@ -2,8 +2,6 @@ package hep.dataforge.data

 import hep.dataforge.meta.*
 import hep.dataforge.names.Name
-import kotlin.coroutines.CoroutineContext
-import kotlin.coroutines.EmptyCoroutineContext
 import kotlin.reflect.KClass

 class ActionEnv(val name: Name, val meta: Meta)
@@ -27,7 +25,6 @@ class PipeBuilder<T, R>(var name: Name, var meta: MetaBuilder) {
 class PipeAction<T : Any, R : Any>(
     val inputType: KClass<T>,
     val outputType: KClass<R>,
-    val context: CoroutineContext = EmptyCoroutineContext,
     private val block: PipeBuilder<T, R>.() -> Unit
 ) : Action<T, R> {

@@ -35,7 +32,7 @@ class PipeAction<T : Any, R : Any>(
         node.checkType(inputType)

         return DataNode.build(outputType) {
-            node.data().forEach { (name, data) ->
+            node.dataSequence().forEach { (name, data) ->
                 //merging data meta with action meta (data meta is primary)
                 val oldMeta = meta.builder().apply { update(data.meta) }
                 // creating environment from old meta and name
@@ -46,10 +43,9 @@ class PipeAction<T : Any, R : Any>(
                 val newName = builder.name
                 //getting new meta
                 val newMeta = builder.meta.seal()
-                //creating a goal with custom context if provided
-                val goal = data.goal.pipe(context) { builder.result(env, it) }
+                val newData = data.pipe(outputType, meta = newMeta) { builder.result(env, it) }
                 //setting the data node
-                this[newName] = Data.of(outputType, goal, newMeta)
+                this[newName] = newData
             }
         }
     }
@@ -57,9 +53,8 @@ class PipeAction<T : Any, R : Any>(

 inline fun <reified T : Any, reified R : Any> DataNode<T>.pipe(
     meta: Meta,
-    context: CoroutineContext = EmptyCoroutineContext,
     noinline action: PipeBuilder<T, R>.() -> Unit
-): DataNode<R> = PipeAction(T::class, R::class, context, action).invoke(this, meta)
+): DataNode<R> = PipeAction(T::class, R::class, action).invoke(this, meta)
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/SplitAction.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/SplitAction.kt
index 3aa08990..be9764a6 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/SplitAction.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/SplitAction.kt
@@ -7,8 +7,6 @@ import hep.dataforge.meta.builder
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
 import kotlin.collections.set
-import kotlin.coroutines.CoroutineContext
-import kotlin.coroutines.EmptyCoroutineContext
 import kotlin.reflect.KClass

@@ -37,7 +35,6 @@ class SplitBuilder<T : Any, R : Any>(val name: Name, val meta: Meta) {
 class SplitAction<T : Any, R : Any>(
     val inputType: KClass<T>,
     val outputType: KClass<R>,
-    val context: CoroutineContext = EmptyCoroutineContext,
     private val action: SplitBuilder<T, R>.() -> Unit
 ) : Action<T, R> {

@@ -45,7 +42,7 @@ class SplitAction<T : Any, R : Any>(
         node.checkType(inputType)

         return DataNode.build(outputType) {
-            node.data().forEach { (name, data) ->
+            node.dataSequence().forEach { (name, data) ->

                 val laminate = Laminate(data.meta, meta)

@@ -58,9 +55,7 @@ class SplitAction<T : Any, R : Any>(

                     rule(env)

-                    val goal = data.goal.pipe(context = context) { env.result(it) }
-
-                    val res = Data.of(outputType, goal, env.meta)
+                    val res = data.pipe(outputType, meta = env.meta) { env.result(it) }
                     set(env.name, res)
                 }
             }
diff --git a/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/CastDataNode.kt b/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/CastDataNode.kt
index 9ff277cd..324864fc 100644
--- a/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/CastDataNode.kt
+++ b/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/CastDataNode.kt
@@ -1,13 +1,23 @@
 package hep.dataforge.data

-import hep.dataforge.names.Name
+import hep.dataforge.meta.Meta
+import hep.dataforge.names.NameToken
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Deferred
 import kotlin.reflect.KClass
 import kotlin.reflect.full.isSubclassOf

-fun <T : Any> Data<*>.safeCast(type: KClass<T>): Data<T>? {
-    return if (type.isSubclassOf(type)) {
-        @Suppress("UNCHECKED_CAST")
-        Data.of(type, goal as Goal<T>, meta)
+@Suppress("UNCHECKED_CAST")
+fun <T : Any> Data<*>.safeCast(type: KClass<T>): Data<T>?
{ + return if (this.type.isSubclassOf(type)) { + return object : Data { + override val meta: Meta get() = this@safeCast.meta + override val dependencies: Collection> get() = this@safeCast.dependencies + override val result: Deferred? get() = this@safeCast.result as Deferred + override fun startAsync(scope: CoroutineScope): Deferred = this@safeCast.startAsync(scope) as Deferred + override fun reset() = this@safeCast.reset() + override val type: KClass = type + } } else { null } @@ -17,7 +27,7 @@ fun Data.safeCast(type: KClass): Data? { * Filter a node by data and node type. Resulting node and its subnodes is guaranteed to have border type [type], * but could contain empty nodes */ -fun DataNode.cast(type: KClass): DataNode { +fun DataNode.cast(type: KClass): DataNode { return if (this is CastDataNode) { origin.cast(type) } else { @@ -28,19 +38,18 @@ fun DataNode.cast(type: KClass): DataNode { inline fun DataNode.cast(): DataNode = cast(R::class) class CastDataNode(val origin: DataNode, override val type: KClass) : DataNode { - - override fun get(name: Name): Data? = - origin[name]?.safeCast(type) - - override fun getNode(name: Name): DataNode? 
{ - return origin.getNode(name)?.cast(type) + override val items: Map> by lazy { + origin.items.mapNotNull { (key, item) -> + when (item) { + is DataItem.Leaf -> { + (item.value.safeCast(type))?.let { + key to DataItem.Leaf(it) + } + } + is DataItem.Node -> { + key to DataItem.Node(item.value.cast(type)) + } + } + }.associate { it } } - - override fun data(): Sequence>> = - origin.data().mapNotNull { pair -> - pair.second.safeCast(type)?.let { pair.first to it } - } - - override fun nodes(): Sequence>> = - origin.nodes().map { it.first to it.second.cast(type) } } \ No newline at end of file diff --git a/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/_Data.kt b/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/_Data.kt deleted file mode 100644 index 00c9e656..00000000 --- a/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/_Data.kt +++ /dev/null @@ -1,8 +0,0 @@ -package hep.dataforge.data - -import kotlinx.coroutines.runBlocking - -/** - * Block the thread and get data content - */ -fun Data.get(): T = runBlocking { await() } \ No newline at end of file diff --git a/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/checkType.kt b/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/dataJVM.kt similarity index 71% rename from dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/checkType.kt rename to dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/dataJVM.kt index 0b4c602f..f87c4155 100644 --- a/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/checkType.kt +++ b/dataforge-data/src/jvmMain/kotlin/hep/dataforge/data/dataJVM.kt @@ -1,8 +1,14 @@ package hep.dataforge.data +import kotlinx.coroutines.runBlocking import kotlin.reflect.KClass import kotlin.reflect.full.isSuperclassOf +/** + * Block the thread and get data content + */ +fun Data.get(): T = runBlocking { await() } + /** * Check that node is compatible with given type meaning that each element could be cast to the type */ diff --git a/dataforge-io/build.gradle.kts 
b/dataforge-io/build.gradle.kts index 67ffe124..b4306efc 100644 --- a/dataforge-io/build.gradle.kts +++ b/dataforge-io/build.gradle.kts @@ -1,57 +1,26 @@ plugins { - `npm-multiplatform` + id("scientifik.mpp") } -description = "IO for meta" +description = "IO module" +scientifik{ + serialization = true + io = true +} -val ioVersion: String = Versions.ioVersion -val serializationVersion: String = Versions.serializationVersion kotlin { - jvm() - js() sourceSets { - val commonMain by getting{ - dependencies { - api(project(":dataforge-meta")) - //implementation 'org.jetbrains.kotlin:kotlin-reflect' - api("org.jetbrains.kotlinx:kotlinx-serialization-runtime-common:$serializationVersion") - api("org.jetbrains.kotlinx:kotlinx-io:$ioVersion") - } - } - val commonTest by getting { - dependencies { - implementation("org.jetbrains.kotlin:kotlin-test-common") - implementation("org.jetbrains.kotlin:kotlin-test-annotations-common") - } - } - val jvmMain by getting { + commonMain{ dependencies { - api("org.jetbrains.kotlinx:kotlinx-serialization-runtime:$serializationVersion") - api("org.jetbrains.kotlinx:kotlinx-io-jvm:$ioVersion") + api(project(":dataforge-context")) } } - val jvmTest by getting { - dependencies { - implementation("org.jetbrains.kotlin:kotlin-test") - implementation("org.jetbrains.kotlin:kotlin-test-junit") - } - } - val jsMain by getting { - dependencies { - api("org.jetbrains.kotlinx:kotlinx-serialization-runtime-js:$serializationVersion") - api("org.jetbrains.kotlinx:kotlinx-io-js:$ioVersion") - } - } - val jsTest by getting { - dependencies { - implementation("org.jetbrains.kotlin:kotlin-test-js") + jsMain{ + dependencies{ + api(npm("text-encoding")) } } -// iosMain { -// } -// iosTest { -// } } } \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/Binary.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/Binary.kt new file mode 100644 index 00000000..67c962dd --- /dev/null +++ 
b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/Binary.kt @@ -0,0 +1,85 @@ +package hep.dataforge.io + +import kotlinx.io.core.ByteReadPacket +import kotlinx.io.core.Input +import kotlinx.io.core.buildPacket +import kotlinx.io.core.readBytes + +/** + * A source of binary data + */ +interface Binary { + /** + * The size of the binary in bytes + */ + val size: ULong + + /** + * Read a continuous [Input] from this binary starting from the beginning. + * The input is automatically closed on scope close. + * Some implementations may forbid this to be called twice; in that case the second call will throw an exception. + */ + fun read(block: Input.() -> R): R +} + +/** + * A [Binary] with additional random access functionality. By default it allows multiple [read] operations. + */ +@ExperimentalUnsignedTypes +interface RandomAccessBinary : Binary { + /** + * Read at most [size] bytes starting at the [from] offset from the beginning of the binary. + * This method could be called multiple times simultaneously.
+ */ + fun read(from: UInt, size: UInt = UInt.MAX_VALUE, block: Input.() -> R): R + + override fun read(block: Input.() -> R): R = read(0.toUInt(), UInt.MAX_VALUE, block) +} + +fun Binary.readAll(): ByteReadPacket = read { + ByteReadPacket(this.readBytes()) +} + +@ExperimentalUnsignedTypes +fun RandomAccessBinary.readPacket(from: UInt, size: UInt): ByteReadPacket = read(from, size) { + ByteReadPacket(this.readBytes()) +} + +@ExperimentalUnsignedTypes +object EmptyBinary : RandomAccessBinary { + + override val size: ULong = 0.toULong() + + override fun read(from: UInt, size: UInt, block: Input.() -> R): R { + error("The binary is empty") + } +} + +@ExperimentalUnsignedTypes +class ArrayBinary(val array: ByteArray) : RandomAccessBinary { + override val size: ULong get() = array.size.toULong() + + override fun read(from: UInt, size: UInt, block: Input.() -> R): R { + return ByteReadPacket(array, from.toInt(), size.toInt()).block() + } +} + +/** + * Read given binary as object using given format + */ +fun Binary.readWith(format: IOFormat): T = format.run { + read { + readThis() + } +} + +/** + * Write this object to a binary + * TODO make a lazy binary that does not use intermediate array + */ +fun T.writeWith(format: IOFormat): Binary = format.run{ + val packet = buildPacket { + writeThis(this@writeWith) + } + return@run ArrayBinary(packet.readBytes()) +} \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/BinaryMetaFormat.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/BinaryMetaFormat.kt index 0e73934a..daf08756 100644 --- a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/BinaryMetaFormat.kt +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/BinaryMetaFormat.kt @@ -1,5 +1,6 @@ package hep.dataforge.io +import hep.dataforge.descriptors.NodeDescriptor import hep.dataforge.meta.* import hep.dataforge.values.* import kotlinx.io.core.Input @@ -8,12 +9,11 @@ import kotlinx.io.core.readText import 
kotlinx.io.core.writeText object BinaryMetaFormat : MetaFormat { - override fun write(obj: Meta, out: Output) { - out.writeMeta(obj) - } + override val name: String = "bin" + override val key: Short = 0x4249//BI - override fun read(input: Input): Meta { - return (input.readMetaItem() as MetaItem.NodeItem).node + override fun Input.readMeta(descriptor: NodeDescriptor?): Meta { + return (readMetaItem() as MetaItem.NodeItem).node } private fun Output.writeChar(char: Char) = writeByte(char.toByte()) @@ -70,7 +70,7 @@ object BinaryMetaFormat : MetaFormat { } } - private fun Output.writeMeta(meta: Meta) { + override fun Output.writeMeta(meta: Meta, descriptor: NodeDescriptor?) { writeChar('M') writeInt(meta.items.size) meta.items.forEach { (key, item) -> @@ -80,7 +80,7 @@ object BinaryMetaFormat : MetaFormat { writeValue(item.value) } is MetaItem.NodeItem -> { - writeMeta(item.node) + writeThis(item.node) } } } @@ -91,9 +91,9 @@ object BinaryMetaFormat : MetaFormat { return readText(max = length) } + @Suppress("UNCHECKED_CAST") private fun Input.readMetaItem(): MetaItem { - val keyChar = readByte().toChar() - return when (keyChar) { + return when (val keyChar = readByte().toChar()) { 'S' -> MetaItem.ValueItem(StringValue(readString())) 'N' -> MetaItem.ValueItem(Null) '+' -> MetaItem.ValueItem(True) @@ -122,11 +122,4 @@ object BinaryMetaFormat : MetaFormat { else -> error("Unknown serialization key character: $keyChar") } } -} - -class BinaryMetaFormatFactory : MetaFormatFactory { - override val name: String = "bin" - override val key: Short = 0x4249//BI - - override fun build(): MetaFormat = BinaryMetaFormat } \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/Envelope.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/Envelope.kt index 1a9e58d7..c2abca21 100644 --- a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/Envelope.kt +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/Envelope.kt @@ -1,13 +1,13 @@ 
package hep.dataforge.io +import hep.dataforge.meta.Laminate import hep.dataforge.meta.Meta import hep.dataforge.meta.get import hep.dataforge.meta.string -import kotlinx.io.core.Input interface Envelope { val meta: Meta - val data: Input? + val data: Binary? companion object { @@ -23,11 +23,7 @@ interface Envelope { } } -class SimpleEnvelope(override val meta: Meta, val dataProvider: () -> Input?) : Envelope{ - override val data: Input? - get() = dataProvider() - -} +class SimpleEnvelope(override val meta: Meta, override val data: Binary?) : Envelope /** * The purpose of the envelope @@ -50,3 +46,21 @@ val Envelope.dataType: String? get() = meta[Envelope.ENVELOPE_DATA_TYPE_KEY].str */ val Envelope.description: String? get() = meta[Envelope.ENVELOPE_DESCRIPTION_KEY].string +/** + * An envelope that wraps an existing envelope and adds one or several additional layers of meta + */ +class ProxyEnvelope(val source: Envelope, vararg meta: Meta) : Envelope { + override val meta: Laminate = Laminate(*meta, source.meta) + override val data: Binary?
get() = source.data +} + +/** + * Add a few meta layers to an existing envelope + */ +fun Envelope.withMetaLayers(vararg layers: Meta): Envelope { + return when { + layers.isEmpty() -> this + this is ProxyEnvelope -> ProxyEnvelope(source, *layers, *this.meta.layers.toTypedArray()) + else -> ProxyEnvelope(this, *layers) + } +} \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/EnvelopeFormat.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/EnvelopeFormat.kt new file mode 100644 index 00000000..24217e14 --- /dev/null +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/EnvelopeFormat.kt @@ -0,0 +1,31 @@ +package hep.dataforge.io + +import hep.dataforge.context.Named +import hep.dataforge.io.EnvelopeFormat.Companion.ENVELOPE_FORMAT_TYPE +import hep.dataforge.meta.Meta +import hep.dataforge.provider.Type +import kotlinx.io.core.Input +import kotlinx.io.core.Output + +/** + * A partially read envelope with meta, but without data + */ +@ExperimentalUnsignedTypes +data class PartialEnvelope(val meta: Meta, val dataOffset: UInt, val dataSize: ULong?)
+ +@Type(ENVELOPE_FORMAT_TYPE) +interface EnvelopeFormat : IOFormat, Named { + fun Input.readPartial(formats: Collection = IOPlugin.defaultMetaFormats): PartialEnvelope + + fun Input.readEnvelope(formats: Collection = IOPlugin.defaultMetaFormats): Envelope + + override fun Input.readThis(): Envelope = readEnvelope() + + fun Output.writeEnvelope(envelope: Envelope, format: MetaFormat = JsonMetaFormat) + + override fun Output.writeThis(obj: Envelope) = writeEnvelope(obj) + + companion object { + const val ENVELOPE_FORMAT_TYPE = "envelopeFormat" + } +} \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/FunctionServer.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/FunctionServer.kt new file mode 100644 index 00000000..2b254716 --- /dev/null +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/FunctionServer.kt @@ -0,0 +1,38 @@ +package hep.dataforge.io + +import kotlin.reflect.KClass + +/** + * A descriptor for specific type of functions + */ +interface FunctionSpec { + val inputType: KClass + val outputType: KClass +} + +/** + * A server that could produce asynchronous function values + */ +interface FunctionServer { + /** + * Call a function with given name and descriptor + */ + suspend fun > call(name: String, descriptor: D, arg: T): R + + /** + * Resolve a function descriptor for given types + */ + fun resolveType(inputType: KClass, outputType: KClass): FunctionSpec + + /** + * Get a generic suspended function with given name and descriptor + */ + operator fun > get(name: String, descriptor: D): (suspend (T) -> R) = + { call(name, descriptor, it) } +} + +suspend inline fun FunctionServer.call(name: String, arg: T): R = + call(name, resolveType(T::class, R::class), arg) + +inline operator fun FunctionServer.get(name: String): (suspend (T) -> R) = + get(name, resolveType(T::class, R::class)) diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/IOFormat.kt 
b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/IOFormat.kt index ce58c05d..9cc9a584 100644 --- a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/IOFormat.kt +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/IOFormat.kt @@ -1,10 +1,14 @@ package hep.dataforge.io -import kotlinx.io.core.Input -import kotlinx.io.core.Output - +import kotlinx.io.core.* +/** + * An interface for serialization facilities + */ interface IOFormat { - fun write(obj: T, out: Output) - fun read(input: Input): T -} \ No newline at end of file + fun Output.writeThis(obj: T) + fun Input.readThis(): T +} + +fun IOFormat.writePacket(obj: T): ByteReadPacket = buildPacket { writeThis(obj) } +fun IOFormat.writeBytes(obj: T): ByteArray = buildPacket { writeThis(obj) }.readBytes() \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/IOPlugin.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/IOPlugin.kt new file mode 100644 index 00000000..b65e4982 --- /dev/null +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/IOPlugin.kt @@ -0,0 +1,37 @@ +package hep.dataforge.io + +import hep.dataforge.context.AbstractPlugin +import hep.dataforge.context.PluginFactory +import hep.dataforge.context.PluginTag +import hep.dataforge.context.content +import hep.dataforge.meta.Meta +import hep.dataforge.names.Name +import kotlin.reflect.KClass + +class IOPlugin(meta: Meta) : AbstractPlugin(meta) { + override val tag: PluginTag get() = Companion.tag + + val metaFormats by lazy { + context.content(MetaFormat.META_FORMAT_TYPE).values + } + + fun metaFormat(key: Short): MetaFormat? = metaFormats.find { it.key == key } + fun metaFormat(name: String): MetaFormat?
= metaFormats.find { it.name == name } + + override fun provideTop(target: String): Map { + return when (target) { + MetaFormat.META_FORMAT_TYPE -> defaultMetaFormats.toMap() + EnvelopeFormat.ENVELOPE_FORMAT_TYPE -> defaultEnvelopeFormats.toMap() + else -> super.provideTop(target) + } + } + + companion object : PluginFactory { + val defaultMetaFormats: List = listOf(JsonMetaFormat, BinaryMetaFormat) + val defaultEnvelopeFormats = listOf(TaggedEnvelopeFormat) + + override val tag: PluginTag = PluginTag("io", group = PluginTag.DATAFORGE_GROUP) + override val type: KClass = IOPlugin::class + override fun invoke(meta: Meta): IOPlugin = IOPlugin(meta) + } +} \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/JsonMetaFormat.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/JsonMetaFormat.kt index 9d7b5739..64b368e2 100644 --- a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/JsonMetaFormat.kt +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/JsonMetaFormat.kt @@ -1,6 +1,10 @@ package hep.dataforge.io +import hep.dataforge.descriptors.ItemDescriptor +import hep.dataforge.descriptors.NodeDescriptor +import hep.dataforge.descriptors.ValueDescriptor import hep.dataforge.meta.Meta +import hep.dataforge.meta.MetaBase import hep.dataforge.meta.MetaItem import hep.dataforge.names.NameToken import hep.dataforge.names.toName @@ -10,17 +14,23 @@ import kotlinx.io.core.Output import kotlinx.io.core.readText import kotlinx.io.core.writeText import kotlinx.serialization.json.* +import kotlin.collections.component1 +import kotlin.collections.component2 +import kotlin.collections.set object JsonMetaFormat : MetaFormat { - override fun write(obj: Meta, out: Output) { - val str = obj.toJson().toString() - out.writeText(str) + override val name: String = "json" + override val key: Short = 0x4a53//"JS" + + override fun Output.writeMeta(meta: Meta, descriptor: NodeDescriptor?) 
{ + val json = meta.toJson(descriptor) + writeText(json.toString()) } - override fun read(input: Input): Meta { - val str = input.readText() + override fun Input.readMeta(descriptor: NodeDescriptor?): Meta { + val str = readText() val json = Json.plain.parseJson(str) if (json is JsonObject) { @@ -31,8 +41,8 @@ } } -fun Value.toJson(): JsonElement { - return if(isList()){ +fun Value.toJson(descriptor: ValueDescriptor? = null): JsonElement { + return if (isList()) { JsonArray(list.map { it.toJson() }) } else { when (type) { @@ -44,48 +54,96 @@ } } -fun Meta.toJson(): JsonObject { - val map = this.items.mapValues { entry -> - val value = entry.value - when (value) { - is MetaItem.ValueItem -> value.value.toJson() - is MetaItem.NodeItem -> value.node.toJson() +//Use these methods to customize JSON key mapping +private fun NameToken.toJsonKey(descriptor: ItemDescriptor?) = toString() + +private fun NodeDescriptor?.getDescriptor(key: String) = this?.items?.get(key) + +fun Meta.toJson(descriptor: NodeDescriptor? = null): JsonObject { + + //TODO search for same name siblings and arrange them into arrays + val map = this.items.entries.associate { (name, item) -> + val itemDescriptor = descriptor?.items?.get(name.body) + val key = name.toJsonKey(itemDescriptor) + val value = when (item) { + is MetaItem.ValueItem -> { + item.value.toJson(itemDescriptor as? ValueDescriptor) + } + is MetaItem.NodeItem -> { + item.node.toJson(itemDescriptor as? NodeDescriptor) + } } - }.mapKeys { it.key.toString() } + key to value + } return JsonObject(map) } +fun JsonObject.toMeta(descriptor: NodeDescriptor?
= null): JsonMeta = JsonMeta(this, descriptor) -fun JsonObject.toMeta() = JsonMeta(this) - -class JsonMeta(val json: JsonObject) : Meta { +fun JsonPrimitive.toValue(descriptor: ValueDescriptor?): Value { + return when (this) { + JsonNull -> Null + else -> this.content.parseValue() // Optimize number and boolean parsing + } +} - private fun JsonPrimitive.toValue(): Value { - return when (this) { - JsonNull -> Null - else -> this.content.parseValue() // Optimize number and boolean parsing +fun JsonElement.toMetaItem(descriptor: ItemDescriptor? = null): MetaItem = when (this) { + is JsonPrimitive -> { + val value = this.toValue(descriptor as? ValueDescriptor) + MetaItem.ValueItem(value) + } + is JsonObject -> { + val meta = toMeta(descriptor as? NodeDescriptor) + MetaItem.NodeItem(meta) + } + is JsonArray -> { + if (this.all { it is JsonPrimitive }) { + val value = if (isEmpty()) { + Null + } else { + ListValue( + map { + //We already checked that all values are primitives + (it as JsonPrimitive).toValue(descriptor as? 
ValueDescriptor) + } + ) + } + MetaItem.ValueItem(value) + } else { + json { + "@value" to this@toMetaItem + }.toMetaItem(descriptor) } } +} - private operator fun MutableMap>.set(key: String, value: JsonElement) = when (value) { - is JsonPrimitive -> this[key] = MetaItem.ValueItem(value.toValue()) - is JsonObject -> this[key] = MetaItem.NodeItem(value.toMeta()) - is JsonArray -> { - when { - value.all { it is JsonPrimitive } -> { - val listValue = ListValue( - value.map { - //We already checked that all values are primitives - (it as JsonPrimitive).toValue() - } - ) - this[key] = MetaItem.ValueItem(listValue) - } - else -> value.forEachIndexed { index, jsonElement -> - when (jsonElement) { - is JsonObject -> this["$key[$index]"] = MetaItem.NodeItem(JsonMeta(jsonElement)) - is JsonPrimitive -> this["$key[$index]"] = MetaItem.ValueItem(jsonElement.toValue()) - is JsonArray -> TODO("Nested arrays not supported") +class JsonMeta(val json: JsonObject, val descriptor: NodeDescriptor? = null) : MetaBase() { + + @Suppress("UNCHECKED_CAST") + private operator fun MutableMap>.set(key: String, value: JsonElement): Unit { + val itemDescriptor = descriptor.getDescriptor(key) + //use name from descriptor in case descriptor name differs from json key + val name = itemDescriptor?.name ?: key + return when (value) { + is JsonPrimitive -> { + this[name] = MetaItem.ValueItem(value.toValue(itemDescriptor as? ValueDescriptor)) as MetaItem + } + is JsonObject -> { + this[name] = MetaItem.NodeItem(value.toMeta(itemDescriptor as? NodeDescriptor)) + } + is JsonArray -> { + when { + value.all { it is JsonPrimitive } -> { + val listValue = ListValue( + value.map { + //We already checked that all values are primitives + (it as JsonPrimitive).toValue(itemDescriptor as? 
ValueDescriptor) + } + ) + this[name] = MetaItem.ValueItem(listValue) as MetaItem + } + else -> value.forEachIndexed { index, jsonElement -> + this["$name[$index]"] = jsonElement.toMetaItem(itemDescriptor) } } } @@ -97,11 +155,4 @@ class JsonMeta(val json: JsonObject) : Meta { json.forEach { (key, value) -> map[key] = value } map.mapKeys { it.key.toName().first()!! } } -} - -class JsonMetaFormatFactory : MetaFormatFactory { - override val name: String = "json" - override val key: Short = 0x4a53//"JS" - - override fun build() = JsonMetaFormat } \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/MetaFormat.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/MetaFormat.kt index 61969536..3185e29e 100644 --- a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/MetaFormat.kt +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/MetaFormat.kt @@ -1,33 +1,49 @@ package hep.dataforge.io +import hep.dataforge.context.Named +import hep.dataforge.descriptors.NodeDescriptor +import hep.dataforge.io.MetaFormat.Companion.META_FORMAT_TYPE import hep.dataforge.meta.Meta -import kotlinx.io.core.BytePacketBuilder -import kotlinx.io.core.ByteReadPacket -import kotlinx.io.core.toByteArray +import hep.dataforge.provider.Type +import kotlinx.io.core.* /** * A format for meta serialization */ -interface MetaFormat: IOFormat - -/** - * ServiceLoader compatible factory - */ -interface MetaFormatFactory { - val name: String +@Type(META_FORMAT_TYPE) +interface MetaFormat : IOFormat, Named { + override val name: String val key: Short - fun build(): MetaFormat + override fun Output.writeThis(obj: Meta) { + writeMeta(obj, null) + } + + override fun Input.readThis(): Meta = readMeta(null) + + fun Output.writeMeta(meta: Meta, descriptor: NodeDescriptor? = null) + fun Input.readMeta(descriptor: NodeDescriptor? 
= null): Meta + + companion object{ + const val META_FORMAT_TYPE = "metaFormat" + } } -fun Meta.asString(format: MetaFormat = JsonMetaFormat): String { - val builder = BytePacketBuilder() - format.write(this, builder) - return builder.build().readText() +fun Meta.toString(format: MetaFormat): String = buildPacket { + format.run { writeThis(this@toString) } +}.readText() + +fun Meta.toBytes(format: MetaFormat = JsonMetaFormat): ByteReadPacket = buildPacket { + format.run { writeThis(this@toBytes) } } + fun MetaFormat.parse(str: String): Meta { - return read(ByteReadPacket(str.toByteArray())) + return ByteReadPacket(str.toByteArray()).readThis() +} + +fun MetaFormat.fromBytes(packet: ByteReadPacket): Meta { + return packet.readThis() } diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/MetaSerializer.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/MetaSerializer.kt new file mode 100644 index 00000000..e1a754eb --- /dev/null +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/MetaSerializer.kt @@ -0,0 +1,53 @@ +package hep.dataforge.io + +import hep.dataforge.meta.Config +import hep.dataforge.meta.Meta +import hep.dataforge.meta.toConfig +import hep.dataforge.names.Name +import hep.dataforge.names.toName +import kotlinx.serialization.* +import kotlinx.serialization.internal.StringDescriptor +import kotlinx.serialization.json.JsonObjectSerializer + +@Serializer(Name::class) +object NameSerializer : KSerializer { + override val descriptor: SerialDescriptor = StringDescriptor + + override fun deserialize(decoder: Decoder): Name { + return decoder.decodeString().toName() + } + + override fun serialize(encoder: Encoder, obj: Name) { + encoder.encodeString(obj.toString()) + } +} + +/** + * Serializer for meta + */ +@Serializer(Meta::class) +object MetaSerializer : KSerializer { + override val descriptor: SerialDescriptor = JsonObjectSerializer.descriptor + + override fun deserialize(decoder: Decoder): Meta { + //currently just delegates
serialization to json serializer + return JsonObjectSerializer.deserialize(decoder).toMeta() + } + + override fun serialize(encoder: Encoder, obj: Meta) { + JsonObjectSerializer.serialize(encoder, obj.toJson()) + } +} + +@Serializer(Config::class) +object ConfigSerializer : KSerializer { + override val descriptor: SerialDescriptor = JsonObjectSerializer.descriptor + + override fun deserialize(decoder: Decoder): Config { + return JsonObjectSerializer.deserialize(decoder).toMeta().toConfig() + } + + override fun serialize(encoder: Encoder, obj: Config) { + JsonObjectSerializer.serialize(encoder, obj.toJson()) + } +} \ No newline at end of file diff --git a/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/TaggedEnvelopeFormat.kt b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/TaggedEnvelopeFormat.kt new file mode 100644 index 00000000..291f539d --- /dev/null +++ b/dataforge-io/src/commonMain/kotlin/hep/dataforge/io/TaggedEnvelopeFormat.kt @@ -0,0 +1,81 @@ +package hep.dataforge.io + +import kotlinx.io.core.* + + +@ExperimentalUnsignedTypes +object TaggedEnvelopeFormat : EnvelopeFormat { + const val VERSION = "DF03" + private const val START_SEQUENCE = "#~" + private const val END_SEQUENCE = "~#\r\n" + private const val TAG_SIZE = 26u + + override val name: String get() = VERSION + + private fun Tag.toBytes(): ByteReadPacket = buildPacket(24) { + writeText(START_SEQUENCE) + writeText(VERSION) + writeShort(metaFormatKey) + writeUInt(metaSize) + writeULong(dataSize) + writeText(END_SEQUENCE) + } + + private fun Input.readTag(): Tag { + val start = readTextExactBytes(2) + if (start != START_SEQUENCE) error("The input is not an envelope") + val version = readTextExactBytes(4) + if (version != VERSION) error("Wrong version of DataForge: expected $VERSION but found $version") + val metaFormatKey = readShort() + val metaLength = readUInt() + val dataLength = readULong() + return Tag(metaFormatKey, metaLength, dataLength) + } + + override fun 
Output.writeEnvelope(envelope: Envelope, format: MetaFormat) {
+        val metaBytes = format.writeBytes(envelope.meta)
+        val tag = Tag(format.key, metaBytes.size.toUInt(), envelope.data?.size ?: 0.toULong())
+        writePacket(tag.toBytes())
+        writeFully(metaBytes)
+        envelope.data?.read { copyTo(this@writeEnvelope) }
+    }
+
+    /**
+     * Read an envelope from input into memory
+     *
+     * @param input an input to read from
+     * @param formats a collection of meta formats to resolve
+     */
+    override fun Input.readEnvelope(formats: Collection<MetaFormat>): Envelope {
+        val tag = readTag()
+
+        val metaFormat = formats.find { it.key == tag.metaFormatKey }
+            ?: error("Meta format with key ${tag.metaFormatKey} not found")
+
+        val metaPacket = ByteReadPacket(readBytes(tag.metaSize.toInt()))
+        val meta = metaFormat.run { metaPacket.readThis() }
+
+        val dataBytes = readBytes(tag.dataSize.toInt())
+
+        return SimpleEnvelope(meta, ArrayBinary(dataBytes))
+    }
+
+    override fun Input.readPartial(formats: Collection<MetaFormat>): PartialEnvelope {
+        val tag = readTag()
+
+        val metaFormat = formats.find { it.key == tag.metaFormatKey }
+            ?: error("Meta format with key ${tag.metaFormatKey} not found")
+
+        val metaPacket = ByteReadPacket(readBytes(tag.metaSize.toInt()))
+        val meta = metaFormat.run { metaPacket.readThis() }
+
+        return PartialEnvelope(meta, TAG_SIZE + tag.metaSize, tag.dataSize)
+    }
+
+    private data class Tag(
+        val metaFormatKey: Short,
+        val metaSize: UInt,
+        val dataSize: ULong
+    )
+
+}
\ No newline at end of file
diff --git a/dataforge-io/src/commonTest/kotlin/hep/dataforge/io/MetaFormatTest.kt b/dataforge-io/src/commonTest/kotlin/hep/dataforge/io/MetaFormatTest.kt
index e177e4d8..16d946e3 100644
--- a/dataforge-io/src/commonTest/kotlin/hep/dataforge/io/MetaFormatTest.kt
+++ b/dataforge-io/src/commonTest/kotlin/hep/dataforge/io/MetaFormatTest.kt
@@ -1,6 +1,9 @@
 package hep.dataforge.io
 
-import hep.dataforge.meta.buildMeta
+import hep.dataforge.meta.*
+import kotlinx.serialization.json.JsonPrimitive
+import
kotlinx.serialization.json.json +import kotlinx.serialization.json.jsonArray import kotlin.test.Test import kotlin.test.assertEquals @@ -12,10 +15,11 @@ class MetaFormatTest { "node" to { "b" to "DDD" "c" to 11.1 + "array" to doubleArrayOf(1.0, 2.0, 3.0) } } - val string = meta.asString(BinaryMetaFormat) - val result = BinaryMetaFormat.parse(string) + val bytes = meta.toBytes(BinaryMetaFormat) + val result = BinaryMetaFormat.fromBytes(bytes) assertEquals(meta, result) } @@ -26,12 +30,44 @@ class MetaFormatTest { "node" to { "b" to "DDD" "c" to 11.1 - "array" to doubleArrayOf(1.0,2.0,3.0) + "array" to doubleArrayOf(1.0, 2.0, 3.0) } } - val string = meta.asString(JsonMetaFormat) + val string = meta.toString(JsonMetaFormat) val result = JsonMetaFormat.parse(string) - assertEquals(meta, result) + + assertEquals(meta, meta.seal()) + + meta.items.keys.forEach { + if (meta[it] != result[it]) error("${meta[it]} != ${result[it]}") + } + + assertEquals(meta, result) + } + + @Test + fun testJsonToMeta(){ + val json = jsonArray{ + //top level array + +jsonArray { + +JsonPrimitive(88) + +json{ + "c" to "aasdad" + "d" to true + } + } + +"value" + +jsonArray { + +JsonPrimitive(1.0) + +JsonPrimitive(2.0) + +JsonPrimitive(3.0) + } + } + val meta = json.toMetaItem().node!! 
+
+        assertEquals(true, meta["@value[0].@value[1].d"].boolean)
+        assertEquals("value", meta["@value[1]"].string)
+        assertEquals(listOf(1.0, 2.0, 3.0), meta["@value[2]"].value?.list?.map { it.number.toDouble() })
     }
 }
\ No newline at end of file
diff --git a/dataforge-io/src/commonTest/kotlin/hep/dataforge/io/MetaSerializerTest.kt b/dataforge-io/src/commonTest/kotlin/hep/dataforge/io/MetaSerializerTest.kt
new file mode 100644
index 00000000..ea5854d2
--- /dev/null
+++ b/dataforge-io/src/commonTest/kotlin/hep/dataforge/io/MetaSerializerTest.kt
@@ -0,0 +1,33 @@
+package hep.dataforge.io
+
+import hep.dataforge.meta.buildMeta
+import hep.dataforge.names.toName
+import kotlinx.serialization.json.Json
+import kotlin.test.Test
+import kotlin.test.assertEquals
+
+class MetaSerializerTest {
+    @Test
+    fun testMetaSerialization() {
+        val meta = buildMeta {
+            "a" to 22
+            "node" to {
+                "b" to "DDD"
+                "c" to 11.1
+                "array" to doubleArrayOf(1.0, 2.0, 3.0)
+            }
+        }
+
+        val string = Json.indented.stringify(MetaSerializer, meta)
+        val restored = Json.plain.parse(MetaSerializer, string)
+        assertEquals(meta, restored)
+    }
+
+    @Test
+    fun testNameSerialization() {
+        val name = "a.b.c".toName()
+        val string = Json.indented.stringify(NameSerializer, name)
+        val restored = Json.plain.parse(NameSerializer, string)
+        assertEquals(name, restored)
+    }
+}
\ No newline at end of file
diff --git a/dataforge-io/src/jvmMain/kotlin/hep/dataforge/io/FileBinary.kt b/dataforge-io/src/jvmMain/kotlin/hep/dataforge/io/FileBinary.kt
new file mode 100644
index 00000000..038281d4
--- /dev/null
+++ b/dataforge-io/src/jvmMain/kotlin/hep/dataforge/io/FileBinary.kt
@@ -0,0 +1,21 @@
+package hep.dataforge.io
+
+import kotlinx.io.core.ByteReadPacket
+import kotlinx.io.core.Input
+import java.nio.channels.FileChannel
+import java.nio.file.Files
+import java.nio.file.Path
+import java.nio.file.StandardOpenOption
+
+@ExperimentalUnsignedTypes
+class FileBinary(val path: Path, private val offset: UInt = 0u, size: ULong?
= null) : RandomAccessBinary {
+
+    override val size: ULong = size ?: (Files.size(path).toULong() - offset).toULong()
+
+    override fun <R> read(from: UInt, size: UInt, block: Input.() -> R): R {
+        FileChannel.open(path, StandardOpenOption.READ).use {
+            val buffer = it.map(FileChannel.MapMode.READ_ONLY, (from + offset).toLong(), size.toLong())
+            return ByteReadPacket(buffer).block()
+        }
+    }
+}
\ No newline at end of file
diff --git a/dataforge-io/src/jvmMain/kotlin/hep/dataforge/io/FileEnvelope.kt b/dataforge-io/src/jvmMain/kotlin/hep/dataforge/io/FileEnvelope.kt
new file mode 100644
index 00000000..0c54012c
--- /dev/null
+++ b/dataforge-io/src/jvmMain/kotlin/hep/dataforge/io/FileEnvelope.kt
@@ -0,0 +1,43 @@
+package hep.dataforge.io
+
+import hep.dataforge.meta.Meta
+import kotlinx.io.nio.asInput
+import kotlinx.io.nio.asOutput
+import java.nio.file.Files
+import java.nio.file.Path
+import java.nio.file.StandardOpenOption
+
+class FileEnvelope internal constructor(val path: Path, val format: EnvelopeFormat) : Envelope {
+    //TODO do not like this constructor. Hope to replace it later
+
+    private val partialEnvelope: PartialEnvelope
+
+    init {
+        val input = Files.newByteChannel(path, StandardOpenOption.READ).asInput()
+        partialEnvelope = format.run { input.readPartial() }
+    }
+
+    override val meta: Meta get() = partialEnvelope.meta
+
+    override val data: Binary?
= FileBinary(path, partialEnvelope.dataOffset, partialEnvelope.dataSize) +} + +fun Path.readEnvelope(format: EnvelopeFormat) = FileEnvelope(this, format) + +fun Path.writeEnvelope( + envelope: Envelope, + format: EnvelopeFormat = TaggedEnvelopeFormat, + metaFormat: MetaFormat = JsonMetaFormat +) { + val output = Files.newByteChannel( + this, + StandardOpenOption.WRITE, + StandardOpenOption.CREATE, + StandardOpenOption.TRUNCATE_EXISTING + ).asOutput() + + with(format) { + output.writeEnvelope(envelope, metaFormat) + } +} + diff --git a/dataforge-meta/build.gradle.kts b/dataforge-meta/build.gradle.kts index 17aaebf8..6f2a5160 100644 --- a/dataforge-meta/build.gradle.kts +++ b/dataforge-meta/build.gradle.kts @@ -1,10 +1,5 @@ plugins { - `npm-multiplatform` + id("scientifik.mpp") } -description = "Meta definition and basic operations on meta" - -kotlin { - jvm() - js() -} \ No newline at end of file +description = "Meta definition and basic operations on meta" \ No newline at end of file diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/Described.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/Described.kt index a0780a3e..f355c828 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/Described.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/Described.kt @@ -1,5 +1,29 @@ package hep.dataforge.descriptors +import hep.dataforge.descriptors.Described.Companion.DESCRIPTOR_NODE +import hep.dataforge.meta.Meta +import hep.dataforge.meta.get +import hep.dataforge.meta.node + +/** + * An object which provides its descriptor + */ interface Described { val descriptor: NodeDescriptor -} \ No newline at end of file + + companion object { + const val DESCRIPTOR_NODE = "@descriptor" + } +} + +/** + * If meta node supplies explicit descriptor, return it, otherwise try to use descriptor node from meta itself + */ +val Meta.descriptor: NodeDescriptor? 
+ get() { + return if (this is Described) { + descriptor + } else { + get(DESCRIPTOR_NODE).node?.let { NodeDescriptor.wrap(it) } + } + } \ No newline at end of file diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/ValueDescriptor.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/ItemDescriptor.kt similarity index 61% rename from dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/ValueDescriptor.kt rename to dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/ItemDescriptor.kt index e3acd07b..09e4d11e 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/ValueDescriptor.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/ItemDescriptor.kt @@ -1,74 +1,160 @@ -/* - * Copyright 2018 Alexander Nozik. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - package hep.dataforge.descriptors import hep.dataforge.meta.* +import hep.dataforge.names.NameToken +import hep.dataforge.names.toName import hep.dataforge.values.False import hep.dataforge.values.True import hep.dataforge.values.Value import hep.dataforge.values.ValueType -/** - * A descriptor for meta value - * - * Descriptor can have non-atomic path. 
It is resolved when descriptor is added to the node - * - * @author Alexander Nozik - */ -class ValueDescriptor(override val config: Config) : Specific { +sealed class ItemDescriptor(override val config: Config) : Specific { /** - * The default for this value. Null if there is no default. + * The name of this item * * @return */ - var default: Value? by value() + var name: String by string { error("Anonymous descriptors are not allowed") } - fun default(v: Any) { - this.default = Value.of(v) + /** + * True if same name siblings with this name are allowed + * + * @return + */ + var multiple: Boolean by boolean(false) + + /** + * The item description + * + * @return + */ + var info: String? by string() + + /** + * A list of tags for this item. Tags used to customize item usage + * + * @return + */ + var tags: List by value { value -> + value?.list?.map { it.string } ?: emptyList() } /** - * True if multiple values with this name are allowed. + * True if the item is required * * @return */ - var multiple: Boolean by boolean(false) + abstract var required: Boolean +} + +/** + * Descriptor for meta node. Could contain additional information for viewing + * and editing. + * + * @author Alexander Nozik + */ +class NodeDescriptor(config: Config) : ItemDescriptor(config){ /** - * True if the value is required + * True if the node is required * * @return */ - var required: Boolean by boolean { default == null } + override var required: Boolean by boolean { default == null } /** - * Value name + * The default for this node. Null if there is no default. * * @return */ - var name: String by string { error("Anonymous descriptors are not allowed") } + var default: Meta? 
by node() + + /** + * The list of value descriptors + */ + val values: Map + get() = config.getAll(VALUE_KEY.toName()).entries.associate { (name, node) -> + name to ValueDescriptor.wrap(node.node ?: error("Value descriptor must be a node")) + } + + fun value(name: String, descriptor: ValueDescriptor) { + if(items.keys.contains(name)) error("The key $name already exists in descriptor") + val token = NameToken(VALUE_KEY, name) + config[token] = descriptor.config + } /** - * The value info + * Add a value descriptor using block for + */ + fun value(name: String, block: ValueDescriptor.() -> Unit) { + value(name, ValueDescriptor.build { this.name = name }.apply(block)) + } + + /** + * The map of children node descriptors + */ + val nodes: Map + get() = config.getAll(NODE_KEY.toName()).entries.associate { (name, node) -> + name to wrap(node.node ?: error("Node descriptor must be a node")) + } + + + fun node(name: String, descriptor: NodeDescriptor) { + if(items.keys.contains(name)) error("The key $name already exists in descriptor") + val token = NameToken(NODE_KEY, name) + config[token] = descriptor.config + } + + fun node(name: String, block: NodeDescriptor.() -> Unit) { + node(name, build { this.name = name }.apply(block)) + } + + val items: Map get() = nodes + values + + + //override val descriptor: NodeDescriptor = empty("descriptor") + + companion object : Specification { + +// const val ITEM_KEY = "item" + const val NODE_KEY = "node" + const val VALUE_KEY = "value" + + override fun wrap(config: Config): NodeDescriptor = NodeDescriptor(config) + + //TODO infer descriptor from spec + } +} + + +/** + * A descriptor for meta value + * + * Descriptor can have non-atomic path. It is resolved when descriptor is added to the node + * + * @author Alexander Nozik + */ +class ValueDescriptor(config: Config) : ItemDescriptor(config){ + + + /** + * True if the value is required * * @return */ - var info: String? 
by string() + override var required: Boolean by boolean { default == null } + + /** + * The default for this value. Null if there is no default. + * + * @return + */ + var default: Value? by value() + + fun default(v: Any) { + this.default = Value.of(v) + } /** * A list of allowed ValueTypes. Empty if any value type allowed @@ -83,10 +169,6 @@ class ValueDescriptor(override val config: Config) : Specific { this.type = listOf(*t) } - var tags: List by value { value -> - value?.list?.map { it.string } ?: emptyList() - } - /** * Check if given value is allowed for here. The type should be allowed and * if it is value should be within allowed values @@ -126,7 +208,7 @@ class ValueDescriptor(override val config: Config) : Specific { override fun wrap(config: Config): ValueDescriptor = ValueDescriptor(config) inline fun > enum(name: String) = - ValueDescriptor.build { + build { this.name = name type(ValueType.STRING) this.allowedValues = enumValues().map { Value.of(it.name) } diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/NodeDescriptor.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/NodeDescriptor.kt deleted file mode 100644 index 902c81a3..00000000 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/descriptors/NodeDescriptor.kt +++ /dev/null @@ -1,128 +0,0 @@ -/* - * Copyright 2018 Alexander Nozik. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -/* - * To change this license header, choose License Headers in Project Properties. - * To change this template file, choose Tools | Templates - * and open the template in the editor. - */ -package hep.dataforge.descriptors - -import hep.dataforge.meta.* -import hep.dataforge.names.NameToken -import hep.dataforge.names.toName - -/** - * Descriptor for meta node. Could contain additional information for viewing - * and editing. - * - * @author Alexander Nozik - */ -class NodeDescriptor(override val config: Config) : Specific { - - /** - * The name of this node - * - * @return - */ - var name: String by string { error("Anonymous descriptors are not allowed") } - - - /** - * The default for this node. Null if there is no default. - * - * @return - */ - var default: Meta? by node() - - /** - * True if multiple children with this nodes name are allowed. Anonymous - * nodes are always single - * - * @return - */ - var multiple: Boolean by boolean(false) - - /** - * True if the node is required - * - * @return - */ - var required: Boolean by boolean { default == null } - - /** - * The node description - * - * @return - */ - var info: String? by string() - - /** - * A list of tags for this node. 
Tags used to customize node usage - * - * @return - */ - var tags: List by value{ value -> - value?.list?.map { it.string } ?: emptyList() - } - - /** - * The list of value descriptors - */ - val values: Map - get() = config.getAll("value".toName()).entries.associate { (name, node) -> - name to ValueDescriptor.wrap(node.node ?: error("Value descriptor must be a node")) - } - - fun value(name: String, descriptor: ValueDescriptor) { - val token = NameToken("value", name) - config[token] = descriptor.config - } - - /** - * Add a value descriptor using block for - */ - fun value(name: String, block: ValueDescriptor.() -> Unit) { - value(name, ValueDescriptor.build { this.name = name }.apply(block)) - } - - /** - * The map of children node descriptors - */ - val nodes: Map - get() = config.getAll("node".toName()).entries.associate { (name, node) -> - name to NodeDescriptor.wrap(node.node ?: error("Node descriptor must be a node")) - } - - - fun node(name: String, descriptor: NodeDescriptor) { - val token = NameToken("node", name) - config[token] = descriptor.config - } - - fun node(name: String, block: NodeDescriptor.() -> Unit) { - node(name, NodeDescriptor.build { this.name = name }.apply(block)) - } - - - //override val descriptor: NodeDescriptor = empty("descriptor") - - companion object : Specification { - - override fun wrap(config: Config): NodeDescriptor = NodeDescriptor(config) - - } -} diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Config.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Config.kt index 1f9a3522..f47d3bcd 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Config.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Config.kt @@ -3,18 +3,61 @@ package hep.dataforge.meta import hep.dataforge.names.Name import hep.dataforge.names.NameToken import hep.dataforge.names.asName +import hep.dataforge.names.plus //TODO add validator to configuration +data class MetaListener( + val owner: Any? 
= null, + val action: (name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) -> Unit +) + /** * Mutable meta representing object state */ -open class Config : MutableMetaNode() { +class Config : AbstractMutableMeta() { + + private val listeners = HashSet() + + private fun itemChanged(name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) { + listeners.forEach { it.action(name, oldItem, newItem) } + } + + /** + * Add change listener to this meta. Owner is declared to be able to remove listeners later. Listener without owner could not be removed + */ + fun onChange(owner: Any?, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit) { + listeners.add(MetaListener(owner, action)) + } + + /** + * Remove all listeners belonging to given owner + */ + fun removeListener(owner: Any?) { + listeners.removeAll { it.owner === owner } + } + + override fun replaceItem(key: NameToken, oldItem: MetaItem?, newItem: MetaItem?) { + if (newItem == null) { + _items.remove(key) + if(oldItem!= null && oldItem is MetaItem.NodeItem) { + oldItem.node.removeListener(this) + } + } else { + _items[key] = newItem + if (newItem is MetaItem.NodeItem) { + newItem.node.onChange(this) { name, oldChild, newChild -> + itemChanged(key + name, oldChild, newChild) + } + } + } + itemChanged(key.asName(), oldItem, newItem) + } /** * Attach configuration node instead of creating one */ - override fun wrap(name: Name, meta: Meta): Config = meta.toConfig() + override fun wrapNode(meta: Meta): Config = meta.toConfig() override fun empty(): Config = Config() @@ -29,7 +72,7 @@ fun Meta.toConfig(): Config = this as? 
Config ?: Config().also { builder -> this.items.mapValues { entry -> val item = entry.value builder[entry.key.asName()] = when (item) { - is MetaItem.ValueItem -> MetaItem.ValueItem(item.value) + is MetaItem.ValueItem -> item.value is MetaItem.NodeItem -> MetaItem.NodeItem(item.node.toConfig()) } } @@ -41,6 +84,6 @@ interface Configurable { fun T.configure(meta: Meta): T = this.apply { config.update(meta) } -fun T.configure(action: Config.() -> Unit): T = this.apply { config.apply(action) } +fun T.configure(action: MetaBuilder.() -> Unit): T = configure(buildMeta(action)) open class SimpleConfigurable(override val config: Config) : Configurable \ No newline at end of file diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/ExtraMetaDelegates.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/ExtraMetaDelegates.kt deleted file mode 100644 index 6e24c6d0..00000000 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/ExtraMetaDelegates.kt +++ /dev/null @@ -1,36 +0,0 @@ -package hep.dataforge.meta - -import kotlin.properties.ReadWriteProperty -import kotlin.reflect.KProperty - -/* - * Extra delegates for special cases - */ - -/** - * A delegate for a string list - */ -class StringListConfigDelegate( - val config: Config, - private val key: String? = null, - private val default: List = emptyList() -) : - ReadWriteProperty> { - override fun getValue(thisRef: Any?, property: KProperty<*>): List { - return config[key ?: property.name]?.value?.list?.map { it.string } ?: default - } - - override fun setValue(thisRef: Any?, property: KProperty<*>, value: List) { - val name = key ?: property.name - config[name] = value - } -} - -fun Configurable.stringList(vararg default: String = emptyArray(), key: String? = null) = - StringListConfigDelegate(config, key, default.toList()) - - -fun Metoid.child(key: String? = null, converter: (Meta) -> T) = ChildDelegate(meta, key, converter) - -fun Configurable.child(key: String? 
= null, converter: (Meta) -> T) = - MutableMorphDelegate(config, key, converter) diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Laminate.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Laminate.kt index b16248ac..b76fd46f 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Laminate.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Laminate.kt @@ -7,7 +7,7 @@ import hep.dataforge.names.NameToken * * */ -class Laminate(layers: List) : Meta { +class Laminate(layers: List) : MetaBase() { val layers: List = layers.flatMap { if (it is Laminate) { @@ -17,9 +17,9 @@ class Laminate(layers: List) : Meta { } } - constructor(vararg layers: Meta) : this(layers.asList()) + constructor(vararg layers: Meta?) : this(layers.filterNotNull()) - override val items: Map> + override val items: Map> get() = layers.map { it.items.keys }.flatten().associateWith { key -> layers.asSequence().map { it.items[key] }.filterNotNull().let(replaceRule) } @@ -79,4 +79,14 @@ class Laminate(layers: List) : Meta { } } +/** + * Create a new [Laminate] adding given layer to the top + */ +fun Laminate.withTop(meta: Meta): Laminate = Laminate(listOf(meta) + layers) + +/** + * Create a new [Laminate] adding given layer to the bottom + */ +fun Laminate.withBottom(meta: Meta): Laminate = Laminate(layers + meta) + //TODO add custom rules for Laminate merge diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Meta.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Meta.kt index e62c3621..9e0dedf9 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Meta.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Meta.kt @@ -14,9 +14,14 @@ import hep.dataforge.values.boolean * * a [ValueItem] (leaf) * * a [NodeItem] (node) */ -sealed class MetaItem { - data class ValueItem(val value: Value) : MetaItem() - data class NodeItem(val node: M) : MetaItem() +sealed class MetaItem { + data class ValueItem(val 
value: Value) : MetaItem() { + override fun toString(): String = value.toString() + } + + data class NodeItem(val node: M) : MetaItem() { + override fun toString(): String = node.toString() + } } /** @@ -35,10 +40,19 @@ interface MetaRepr { * * Same name siblings are supported via elements with the same [Name] but different queries */ interface Meta : MetaRepr { - val items: Map> + /** + * Top level items of meta tree + */ + val items: Map> override fun toMeta(): Meta = this + override fun equals(other: Any?): Boolean + + override fun hashCode(): Int + + override fun toString(): String + companion object { const val TYPE = "meta" /** @@ -50,12 +64,7 @@ interface Meta : MetaRepr { /* Get operations*/ -/** - * Fast [String]-based accessor for item map - */ -operator fun Map.get(body: String, query: String = ""): T? = get(NameToken(body, query)) - -operator fun Meta?.get(name: Name): MetaItem? { +operator fun Meta?.get(name: Name): MetaItem<*>? { if (this == null) return null return name.first()?.let { token -> val tail = name.cutFirst() @@ -66,13 +75,13 @@ operator fun Meta?.get(name: Name): MetaItem? { } } -operator fun Meta?.get(token: NameToken): MetaItem? = this?.items?.get(token) -operator fun Meta?.get(key: String): MetaItem? = get(key.toName()) +operator fun Meta?.get(token: NameToken): MetaItem<*>? = this?.items?.get(token) +operator fun Meta?.get(key: String): MetaItem<*>? = get(key.toName()) /** * Get all items matching given name. 
*/ -fun Meta.getAll(name: Name): Map> { +fun Meta.getAll(name: Name): Map> { val root = when (name.length) { 0 -> error("Can't use empty name for that") 1 -> this @@ -88,22 +97,37 @@ fun Meta.getAll(name: Name): Map> { ?: emptyMap() } -fun Meta.getAll(name: String): Map> = getAll(name.toName()) +fun Meta.getAll(name: String): Map> = getAll(name.toName()) /** * Get a sequence of [Name]-[Value] pairs */ fun Meta.values(): Sequence> { - return items.asSequence().flatMap { entry -> - val item = entry.value + return items.asSequence().flatMap { (key, item) -> when (item) { - is ValueItem -> sequenceOf(entry.key.asName() to item.value) - is NodeItem -> item.node.values().map { pair -> (entry.key.asName() + pair.first) to pair.second } + is ValueItem -> sequenceOf(key.asName() to item.value) + is NodeItem -> item.node.values().map { pair -> (key.asName() + pair.first) to pair.second } + } + } +} + +/** + * Get a sequence of all [Name]-[MetaItem] pairs for all items including nodes + */ +fun Meta.sequence(): Sequence>> { + return sequence { + items.forEach { (key, item) -> + yield(key.asName() to item) + if (item is NodeItem<*>) { + yieldAll(item.node.sequence().map { (innerKey, innerItem) -> + (key + innerKey) to innerItem + }) + } } } } -operator fun Meta.iterator(): Iterator> = values().iterator() +operator fun Meta.iterator(): Iterator>> = sequence().iterator() /** * A meta node that ensures that all of its descendants has at least the same type @@ -115,7 +139,7 @@ interface MetaNode> : Meta { /** * Get all items matching given name. */ -fun > MetaNode.getAll(name: Name): Map> { +fun > M.getAll(name: Name): Map> { val root: MetaNode? = when (name.length) { 0 -> error("Can't use empty name for that") 1 -> this @@ -133,7 +157,8 @@ fun > MetaNode.getAll(name: Name): Map> { fun > M.getAll(name: String): Map> = getAll(name.toName()) -operator fun > MetaNode.get(name: Name): MetaItem? { +operator fun > MetaNode?.get(name: Name): MetaItem? 
{ + if (this == null) return null return name.first()?.let { token -> val tail = name.cutFirst() when (tail.length) { @@ -143,24 +168,44 @@ operator fun > MetaNode.get(name: Name): MetaItem? { } } -operator fun > MetaNode?.get(key: String): MetaItem? = this?.let { get(key.toName()) } +operator fun > MetaNode?.get(key: String): MetaItem? = if (this == null) { + null +} else { + this[key.toName()] +} + +operator fun > MetaNode?.get(key: NameToken): MetaItem? = if (this == null) { + null +} else { + this[key.asName()] +} /** - * Equals and hash code implementation for meta node + * Equals, hashcode and to string for any meta */ -abstract class AbstractMetaNode> : MetaNode { - override fun equals(other: Any?): Boolean { - if (this === other) return true - if (other !is Meta) return false - - return this.items == other.items +abstract class MetaBase: Meta{ + + override fun equals(other: Any?): Boolean = if(other is Meta) { + this.items == other.items +// val items = items +// val otherItems = other.items +// (items.keys == otherItems.keys) && items.keys.all { +// items[it] == otherItems[it] +// } + } else { + false } - override fun hashCode(): Int { - return items.hashCode() - } + override fun hashCode(): Int = items.hashCode() + + override fun toString(): String = items.toString() } +/** + * Equals and hash code implementation for meta node + */ +abstract class AbstractMetaNode> : MetaNode, MetaBase() + /** * The meta implementation which is guaranteed to be immutable. 
* @@ -174,13 +219,14 @@ class SealedMeta internal constructor(override val items: Map entry.value.seal() }) +@Suppress("UNCHECKED_CAST") fun MetaItem<*>.seal(): MetaItem = when (this) { - is MetaItem.ValueItem -> MetaItem.ValueItem(value) - is MetaItem.NodeItem -> MetaItem.NodeItem(node.seal()) + is ValueItem -> this + is NodeItem -> NodeItem(node.seal()) } -object EmptyMeta : Meta { - override val items: Map> = emptyMap() +object EmptyMeta : MetaBase() { + override val items: Map> = emptyMap() } /** @@ -188,8 +234,8 @@ object EmptyMeta : Meta { */ val MetaItem<*>?.value - get() = (this as? MetaItem.ValueItem)?.value - ?: (this?.node?.get(VALUE_KEY) as? MetaItem.ValueItem)?.value + get() = (this as? ValueItem)?.value + ?: (this?.node?.get(VALUE_KEY) as? ValueItem)?.value val MetaItem<*>?.string get() = value?.string val MetaItem<*>?.boolean get() = value?.boolean @@ -211,8 +257,8 @@ val MetaItem<*>?.stringList get() = value?.list?.map { it.string } ?: emptyList( val MetaItem?.node: M? get() = when (this) { null -> null - is MetaItem.ValueItem -> error("Trying to interpret value meta item as node item") - is MetaItem.NodeItem -> node + is ValueItem -> error("Trying to interpret value meta item as node item") + is NodeItem -> node } /** @@ -224,4 +270,4 @@ interface Metoid { fun Value.toMeta() = buildMeta { Meta.VALUE_KEY to this } -fun Meta.isEmpty() = this === EmptyMeta || this.items.isEmpty() \ No newline at end of file +fun Meta.isEmpty() = this === EmptyMeta || this.items.isEmpty() diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MetaBuilder.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MetaBuilder.kt index fe6829d4..322b660a 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MetaBuilder.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MetaBuilder.kt @@ -7,8 +7,8 @@ import hep.dataforge.values.Value /** * DSL builder for meta. 
Is not intended to store mutable state
  */
-class MetaBuilder : MutableMetaNode<MetaBuilder>() {
-    override fun wrap(name: Name, meta: Meta): MetaBuilder = meta.builder()
+class MetaBuilder : AbstractMutableMeta<MetaBuilder>() {
+    override fun wrapNode(meta: Meta): MetaBuilder = if (meta is MetaBuilder) meta else meta.builder()
     override fun empty(): MetaBuilder = MetaBuilder()
 
     infix fun String.to(value: Any) {
@@ -29,6 +29,25 @@
     infix fun String.to(metaBuilder: MetaBuilder.() -> Unit) {
         this@MetaBuilder[this] = MetaBuilder().apply(metaBuilder)
     }
+
+    infix fun Name.to(value: Any) {
+        if (value is Meta) {
+            this@MetaBuilder[this] = value
+        } else {
+            this@MetaBuilder[this] = Value.of(value)
+        }
+    }
+
+    infix fun Name.to(meta: Meta) {
+        this@MetaBuilder[this] = meta
+    }
+
+    infix fun Name.to(value: Iterable<Any>) {
+        this@MetaBuilder[this] = value.toList()
+    }
+
+    infix fun Name.to(metaBuilder: MetaBuilder.() -> Unit) {
+        this@MetaBuilder[this] = MetaBuilder().apply(metaBuilder)
+    }
 }
 
 /**
@@ -39,11 +58,19 @@
 fun Meta.builder(): MetaBuilder {
     items.mapValues { entry ->
         val item = entry.value
         builder[entry.key.asName()] = when (item) {
-            is MetaItem.ValueItem -> MetaItem.ValueItem(item.value)
+            is MetaItem.ValueItem -> item.value
             is MetaItem.NodeItem -> MetaItem.NodeItem(item.node.builder())
         }
     }
 }
 
-fun buildMeta(builder: MetaBuilder.() -> Unit): MetaBuilder = MetaBuilder().apply(builder)
\ No newline at end of file
+/**
+ * Build a [MetaBuilder] using given transformation
+ */
+fun buildMeta(builder: MetaBuilder.() -> Unit): MetaBuilder = MetaBuilder().apply(builder)
+
+/**
+ * Build meta using given source meta as a base
+ */
+fun buildMeta(source: Meta, builder: MetaBuilder.() -> Unit): MetaBuilder = source.builder().apply(builder)
\ No newline at end of file
diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MetaTransformation.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MetaTransformation.kt
new file mode 100644
index
00000000..8deada19 --- /dev/null +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MetaTransformation.kt @@ -0,0 +1,170 @@ +package hep.dataforge.meta + +import hep.dataforge.names.Name + +/** + * A transformation for meta item or a group of items + */ +interface TransformationRule { + + /** + * Check if this transformation + */ + fun matches(name: Name, item: MetaItem<*>?): Boolean + + /** + * Select all items to be transformed. Item could be a value as well as node + * + * @return a sequence of item paths to be transformed + */ + fun selectItems(meta: Meta): Sequence = + meta.sequence().filter { matches(it.first, it.second) }.map { it.first } + + /** + * Apply transformation for a single item (Node or Value) and return resulting tree with absolute path + */ + fun > transformItem(name: Name, item: MetaItem<*>?, target: M): Unit +} + +/** + * A transformation which keeps all elements, matching [selector] unchanged. + */ +data class KeepTransformationRule(val selector: (Name) -> Boolean) : TransformationRule { + override fun matches(name: Name, item: MetaItem<*>?): Boolean { + return selector(name) + } + + override fun selectItems(meta: Meta): Sequence = + meta.sequence().map { it.first }.filter(selector) + + override fun > transformItem(name: Name, item: MetaItem<*>?, target: M) { + if (selector(name)) target[name] = item + } +} + +/** + * A transformation which transforms element with specific name + */ +data class SingleItemTransformationRule( + val from: Name, + val transform: MutableMeta<*>.(Name, MetaItem<*>?) 
-> Unit +) : TransformationRule { + override fun matches(name: Name, item: MetaItem<*>?): Boolean { + return name == from + } + + override fun selectItems(meta: Meta): Sequence = sequenceOf(from) + + override fun > transformItem(name: Name, item: MetaItem<*>?, target: M) { + if (name == this.from) { + target.transform(name, item) + } + } +} + +data class RegexItemTransformationRule( + val from: Regex, + val transform: MutableMeta<*>.(name: Name, MatchResult, MetaItem<*>?) -> Unit +) : TransformationRule { + override fun matches(name: Name, item: MetaItem<*>?): Boolean { + return from.matches(name.toString()) + } + + override fun > transformItem(name: Name, item: MetaItem<*>?, target: M) { + val match = from.matchEntire(name.toString()) + if (match != null) { + target.transform(name, match, item) + } + } + +} + +/** + * A set of [TransformationRule] to either transform static meta or create dynamically updated [MutableMeta] + */ +inline class MetaTransformation(val transformations: Collection) { + + /** + * Produce new meta using only those items that match transformation rules + */ + fun transform(source: Meta): Meta = buildMeta { + transformations.forEach { rule -> + rule.selectItems(source).forEach { name -> + rule.transformItem(name, source[name], this) + } + } + } + + /** + * Transform a meta, replacing all elements found in rules with transformed entries + */ + fun apply(source: Meta): Meta = buildMeta(source) { + transformations.forEach { rule -> + rule.selectItems(source).forEach { name -> + remove(name) + rule.transformItem(name, source[name], this) + } + } + } + + /** + * Listens for changes in the source node and translates them into second node if transformation set contains a corresponding rule. 
+ */ + fun > bind(source: Config, target: M) { + source.onChange(target) { name, _, newItem -> + transformations.forEach { t -> + if (t.matches(name, newItem)) { + t.transformItem(name, newItem, target) + } + } + } + } + + companion object { + fun make(block: MetaTransformationBuilder.() -> Unit): MetaTransformation = + MetaTransformationBuilder().apply(block).build() + } +} + +/** + * A builder for a set of transformation rules + */ +class MetaTransformationBuilder { + val transformations = HashSet() + + /** + * Keep all items with name satisfying the criteria + */ + fun keep(selector: (Name) -> Boolean) { + transformations.add(KeepTransformationRule(selector)) + } + + /** + * Keep specific item (including its descendants) + */ + fun keep(name: Name) { + keep { it == name } + } + + /** + * Keep nodes by regex + */ + fun keep(regex: String) { + transformations.add(RegexItemTransformationRule(regex.toRegex()) { name, _, metaItem -> + setItem(name, metaItem) + }) + } + + /** + * Move an item from [from] to [to], optionally applying [operation] it defined + */ + fun move(from: Name, to: Name, operation: (MetaItem<*>?) -> Any? = { it }) { + transformations.add( + SingleItemTransformationRule(from) { _, item -> + set(to, operation(item)) + } + ) + } + + fun build() = MetaTransformation(transformations) +} \ No newline at end of file diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MutableMeta.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MutableMeta.kt index 8182495b..285edd89 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MutableMeta.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/MutableMeta.kt @@ -3,17 +3,11 @@ package hep.dataforge.meta import hep.dataforge.names.* import hep.dataforge.values.Value -internal data class MetaListener( - val owner: Any? = null, - val action: (name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) 
-> Unit -) - - interface MutableMeta> : MetaNode { override val items: Map> - operator fun set(name: Name, item: MetaItem?) - fun onChange(owner: Any? = null, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit) - fun removeListener(owner: Any? = null) + operator fun set(name: Name, item: MetaItem<*>?) +// fun onChange(owner: Any? = null, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit) +// fun removeListener(owner: Any? = null) } /** @@ -21,111 +15,102 @@ interface MutableMeta> : MetaNode { * * Changes in Meta are not thread safe. */ -abstract class MutableMetaNode> : AbstractMetaNode(), MutableMeta { - private val listeners = HashSet() - - /** - * Add change listener to this meta. Owner is declared to be able to remove listeners later. Listener without owner could not be removed - */ - override fun onChange(owner: Any?, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit) { - listeners.add(MetaListener(owner, action)) - } - - /** - * Remove all listeners belonging to given owner - */ - override fun removeListener(owner: Any?) { - listeners.removeAll { it.owner === owner } - } - - private val _items: MutableMap> = HashMap() +abstract class AbstractMutableMeta> : AbstractMetaNode(), MutableMeta { + protected val _items: MutableMap> = HashMap() override val items: Map> get() = _items - protected fun itemChanged(name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) { - listeners.forEach { it.action(name, oldItem, newItem) } - } + //protected abstract fun itemChanged(name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) protected open fun replaceItem(key: NameToken, oldItem: MetaItem?, newItem: MetaItem?) 
{ if (newItem == null) { _items.remove(key) - oldItem?.node?.removeListener(this) } else { _items[key] = newItem - if (newItem is MetaItem.NodeItem) { - newItem.node.onChange(this) { name, oldChild, newChild -> - itemChanged(key + name, oldChild, newChild) - } - } } - itemChanged(key.asName(), oldItem, newItem) + //itemChanged(key.asName(), oldItem, newItem) + } + + @Suppress("UNCHECKED_CAST") + protected fun wrapItem(item: MetaItem<*>?): MetaItem? = when (item) { + null -> null + is MetaItem.ValueItem -> item + is MetaItem.NodeItem -> MetaItem.NodeItem(wrapNode(item.node)) } /** * Transform given meta to node type of this meta tree - * @param name the name of the node where meta should be attached. Needed for correct assignment validators and styles - * @param meta the node itself */ - internal abstract fun wrap(name: Name, meta: Meta): M + protected abstract fun wrapNode(meta: Meta): M /** * Create empty node */ internal abstract fun empty(): M - override operator fun set(name: Name, item: MetaItem?) { + override operator fun set(name: Name, item: MetaItem<*>?) { when (name.length) { 0 -> error("Can't setValue meta item for empty name") 1 -> { val token = name.first()!! - replaceItem(token, get(name), item) + replaceItem(token, get(name), wrapItem(item)) } else -> { val token = name.first()!! //get existing or create new node. 
Query is ignored for new node - val child = this.items[token]?.node - ?: empty().also { this[token.body.toName()] = MetaItem.NodeItem(it) } - child[name.cutFirst()] = item + if(items[token] == null){ + replaceItem(token,null, MetaItem.NodeItem(empty())) + } + items[token]?.node!![name.cutFirst()] = item } } } } -fun > MutableMeta.remove(name: Name) = set(name, null) -fun > MutableMeta.remove(name: String) = remove(name.toName()) -fun > MutableMeta.setValue(name: Name, value: Value) = set(name, MetaItem.ValueItem(value)) -fun > MutableMeta.setItem(name: String, item: MetaItem) = set(name.toName(), item) -fun > MutableMeta.setValue(name: String, value: Value) = +@Suppress("NOTHING_TO_INLINE") +inline fun MutableMeta<*>.remove(name: Name) = set(name, null) +@Suppress("NOTHING_TO_INLINE") +inline fun MutableMeta<*>.remove(name: String) = remove(name.toName()) + +fun MutableMeta<*>.setValue(name: Name, value: Value) = + set(name, MetaItem.ValueItem(value)) + +fun MutableMeta<*>.setValue(name: String, value: Value) = set(name.toName(), MetaItem.ValueItem(value)) -fun > MutableMeta.setItem(token: NameToken, item: MetaItem?) = set(token.asName(), item) +fun MutableMeta<*>.setItem(name: Name, item: MetaItem<*>?) { + when (item) { + null -> remove(name) + is MetaItem.ValueItem -> setValue(name, item.value) + is MetaItem.NodeItem<*> -> setNode(name, item.node) + } +} + +fun MutableMeta<*>.setItem(name: String, item: MetaItem<*>?) = setItem(name.toName(), item) -fun > MutableMetaNode.setNode(name: Name, node: Meta) = - set(name, MetaItem.NodeItem(wrap(name, node))) +fun MutableMeta<*>.setNode(name: Name, node: Meta) = + set(name, MetaItem.NodeItem(node)) -fun > MutableMetaNode.setNode(name: String, node: Meta) = setNode(name.toName(), node) +fun MutableMeta<*>.setNode(name: String, node: Meta) = setNode(name.toName(), node) /** * Universal set method */ -operator fun > M.set(name: Name, value: Any?) { +operator fun MutableMeta<*>.set(name: Name, value: Any?) 
{ when (value) { null -> remove(name) - is MetaItem<*> -> when (value) { - is MetaItem.ValueItem<*> -> setValue(name, value.value) - is MetaItem.NodeItem<*> -> setNode(name, value.node) - } + is MetaItem<*> -> setItem(name, value) is Meta -> setNode(name, value) is Specific -> setNode(name, value.config) else -> setValue(name, Value.of(value)) } } -operator fun > M.set(name: NameToken, value: Any?) = set(name.asName(), value) +operator fun MutableMeta<*>.set(name: NameToken, value: Any?) = set(name.asName(), value) -operator fun > M.set(key: String, value: Any?) = set(key.toName(), value) +operator fun MutableMeta<*>.set(key: String, value: Any?) = set(key.toName(), value) /** * Update existing mutable node with another node. The rules are following: @@ -133,10 +118,9 @@ operator fun > M.set(key: String, value: Any?) = set(key. * * node updates node and replaces anything but node * * node list updates node list if number of nodes in the list is the same and replaces anything otherwise */ -fun > M.update(meta: Meta) { +fun > M.update(meta: Meta) { meta.items.forEach { entry -> - val value = entry.value - when (value) { + when (val value = entry.value) { is MetaItem.ValueItem -> setValue(entry.key.asName(), value.value) is MetaItem.NodeItem -> (this[entry.key.asName()] as? 
MetaItem.NodeItem)?.node?.update(value.node) ?: run { setNode(entry.key.asName(), value.node) } @@ -146,10 +130,10 @@ fun > M.update(meta: Meta) { /* Same name siblings generation */ -fun > M.setIndexed( +fun MutableMeta<*>.setIndexedItems( name: Name, - items: Iterable>, - indexFactory: MetaItem.(index: Int) -> String = { it.toString() } + items: Iterable>, + indexFactory: MetaItem<*>.(index: Int) -> String = { it.toString() } ) { val tokens = name.tokens.toMutableList() val last = tokens.last() @@ -160,21 +144,21 @@ fun > M.setIndexed( } } -fun > M.setIndexed( +fun MutableMeta<*>.setIndexed( name: Name, metas: Iterable, - indexFactory: MetaItem.(index: Int) -> String = { it.toString() } + indexFactory: MetaItem<*>.(index: Int) -> String = { it.toString() } ) { - setIndexed(name, metas.map { MetaItem.NodeItem(wrap(name, it)) }, indexFactory) + setIndexedItems(name, metas.map { MetaItem.NodeItem(it) }, indexFactory) } -operator fun > M.set(name: Name, metas: Iterable) = setIndexed(name, metas) -operator fun > M.set(name: String, metas: Iterable) = setIndexed(name.toName(), metas) +operator fun MutableMeta<*>.set(name: Name, metas: Iterable): Unit = setIndexed(name, metas) +operator fun MutableMeta<*>.set(name: String, metas: Iterable): Unit = setIndexed(name.toName(), metas) /** * Append the node with a same-name-sibling, automatically generating numerical index */ -fun > M.append(name: Name, value: Any?) { +fun MutableMeta<*>.append(name: Name, value: Any?) { require(!name.isEmpty()) { "Name could not be empty for append operation" } val newIndex = name.last()!!.index if (newIndex.isNotEmpty()) { @@ -185,4 +169,4 @@ fun > M.append(name: Name, value: Any?) { } } -fun > M.append(name: String, value: Any?) = append(name.toName(), value) \ No newline at end of file +fun MutableMeta<*>.append(name: String, value: Any?) 
= append(name.toName(), value) \ No newline at end of file diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Specific.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Specific.kt index 10210fb4..a89a44a1 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Specific.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Specific.kt @@ -1,11 +1,13 @@ package hep.dataforge.meta /** - * Marker interface for specifications + * Marker interface for classes with specifications */ -interface Specific : Configurable { - operator fun get(name: String): MetaItem? = config[name] -} +interface Specific : Configurable + +//TODO separate mutable config from immutable meta to allow free wrapping of meta + +operator fun Specific.get(name: String): MetaItem<*>? = config[name] /** * Allows to apply custom configuration in a type safe way to simple untyped configuration. @@ -29,6 +31,7 @@ interface Specification { */ fun wrap(config: Config): T + //TODO replace by free wrapper fun wrap(meta: Meta): T = wrap(meta.toConfig()) } @@ -59,4 +62,4 @@ fun > S.createStyle(action: C.() -> Unit): Me fun Specific.spec( spec: Specification, key: String? 
= null -) = MutableMorphDelegate(config, key) { spec.wrap(it) } \ No newline at end of file +): MutableMorphDelegate = MutableMorphDelegate(config, key) { spec.wrap(it) } \ No newline at end of file diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Styled.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Styled.kt index dcc36a5a..55d652aa 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Styled.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Styled.kt @@ -11,7 +11,11 @@ import kotlin.reflect.KProperty * @param base - unchangeable base * @param style - the style */ -class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta { +class Styled(val base: Meta, val style: Config = Config().empty()) : AbstractMutableMeta() { + override fun wrapNode(meta: Meta): Styled = Styled(meta) + + override fun empty(): Styled = Styled(EmptyMeta) + override val items: Map> get() = (base.items.keys + style.items.keys).associate { key -> val value = base.items[key] @@ -19,10 +23,10 @@ class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta val item: MetaItem = when (value) { null -> when (styleValue) { null -> error("Should be unreachable") - is MetaItem.ValueItem -> MetaItem.ValueItem(styleValue.value) is MetaItem.NodeItem -> MetaItem.NodeItem(Styled(style.empty(), styleValue.node)) + is MetaItem.ValueItem -> styleValue } - is MetaItem.ValueItem -> MetaItem.ValueItem(value.value) + is MetaItem.ValueItem -> value is MetaItem.NodeItem -> MetaItem.NodeItem( Styled(value.node, styleValue?.node ?: Config.empty()) ) @@ -30,7 +34,7 @@ class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta key to item } - override fun set(name: Name, item: MetaItem?) { + override fun set(name: Name, item: MetaItem<*>?) 
{ if (item == null) { style.remove(name) } else { @@ -38,12 +42,12 @@ class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta } } - override fun onChange(owner: Any?, action: (Name, before: MetaItem<*>?, after: MetaItem<*>?) -> Unit) { + fun onChange(owner: Any?, action: (Name, before: MetaItem<*>?, after: MetaItem<*>?) -> Unit) { //TODO test correct behavior style.onChange(owner) { name, before, after -> action(name, before ?: base[name], after ?: base[name]) } } - override fun removeListener(owner: Any?) { + fun removeListener(owner: Any?) { style.removeListener(owner) } } diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/ConfigDelegates.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/configDelegates.kt similarity index 69% rename from dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/ConfigDelegates.kt rename to dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/configDelegates.kt index e848ec31..6871b6e3 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/ConfigDelegates.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/configDelegates.kt @@ -1,5 +1,6 @@ package hep.dataforge.meta +import hep.dataforge.values.DoubleArrayValue import hep.dataforge.values.Null import hep.dataforge.values.Value import kotlin.jvm.JvmName @@ -10,25 +11,24 @@ import kotlin.jvm.JvmName /** * A property delegate that uses custom key */ -fun Configurable.value(default: Any = Null, key: String? = null) = +fun Configurable.value(default: Any = Null, key: String? = null): MutableValueDelegate = MutableValueDelegate(config, key, Value.of(default)) -fun Configurable.value(default: T? = null, key: String? = null, transform: (Value?) -> T) = - MutableValueDelegate(config, key, Value.of(default)).transform(reader = transform) +fun Configurable.value( + default: T? = null, + key: String? = null, + writer: (T) -> Value = { Value.of(it) }, + reader: (Value?) 
-> T +): ReadWriteDelegateWrapper = + MutableValueDelegate(config, key, default?.let { Value.of(it) }).transform(reader = reader, writer = writer) -fun Configurable.stringList(key: String? = null) = - value(key) { it?.list?.map { value -> value.string } ?: emptyList() } - -fun Configurable.numberList(key: String? = null) = - value(key) { it?.list?.map { value -> value.number } ?: emptyList() } - -fun Configurable.string(default: String? = null, key: String? = null) = +fun Configurable.string(default: String? = null, key: String? = null): MutableStringDelegate = MutableStringDelegate(config, key, default) -fun Configurable.boolean(default: Boolean? = null, key: String? = null) = +fun Configurable.boolean(default: Boolean? = null, key: String? = null): MutableBooleanDelegate = MutableBooleanDelegate(config, key, default) -fun Configurable.number(default: Number? = null, key: String? = null) = +fun Configurable.number(default: Number? = null, key: String? = null): MutableNumberDelegate = MutableNumberDelegate(config, key, default) /* Number delegates*/ @@ -111,3 +111,26 @@ fun Configurable.spec(spec: Specification, key: String? = null fun Configurable.spec(builder: (Config) -> T, key: String? = null) = MutableMorphDelegate(config, key) { specification(builder).wrap(it) } + +/* + * Extra delegates for special cases + */ + +fun Configurable.stringList(key: String? = null): ReadWriteDelegateWrapper> = + value(emptyList(), key) { it?.list?.map { value -> value.string } ?: emptyList() } + +fun Configurable.numberList(key: String? = null): ReadWriteDelegateWrapper> = + value(emptyList(), key) { it?.list?.map { value -> value.number } ?: emptyList() } + +/** + * A special delegate for double arrays + */ +fun Configurable.doubleArray(key: String? = null): ReadWriteDelegateWrapper = + value(doubleArrayOf(), key) { + (it as? 
DoubleArrayValue)?.value + ?: it?.list?.map { value -> value.number.toDouble() }?.toDoubleArray() + ?: doubleArrayOf() + } + +fun Configurable.child(key: String? = null, converter: (Meta) -> T) = + MutableMorphDelegate(config, key, converter) diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Delegates.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/metaDelegates.kt similarity index 96% rename from dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Delegates.kt rename to dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/metaDelegates.kt index 377ec994..9d606b6f 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/Delegates.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/meta/metaDelegates.kt @@ -113,8 +113,11 @@ class SafeEnumDelegate>( //Child node delegate -class ChildDelegate(val meta: Meta, private val key: String? = null, private val converter: (Meta) -> T) : - ReadOnlyProperty { +class ChildDelegate( + val meta: Meta, + private val key: String? = null, + private val converter: (Meta) -> T +) : ReadOnlyProperty { override fun getValue(thisRef: Any?, property: KProperty<*>): T? { return meta[key ?: property.name]?.node?.let { converter(it) } } @@ -164,6 +167,8 @@ inline fun > Meta.enum(default: E, key: String? = null) = SafeEnumDelegate(this, key, default) { enumValueOf(it) } +fun Metoid.child(key: String? = null, converter: (Meta) -> T) = ChildDelegate(meta, key, converter) + /* Read-write delegates */ class MutableValueDelegate>( @@ -327,7 +332,7 @@ class MutableSafeEnumvDelegate, E : Enum>( //Child node delegate -class MutableNodeDelegate>( +class MutableNodeDelegate>( val meta: M, private val key: String? = null ) : ReadWriteProperty { @@ -340,7 +345,7 @@ class MutableNodeDelegate>( } } -class MutableMorphDelegate, T : Configurable>( +class MutableMorphDelegate, T : Configurable>( val meta: M, private val key: String? 
= null, private val converter: (Meta) -> T @@ -390,7 +395,7 @@ fun > M.boolean(default: Boolean? = null, key: String? = null fun > M.number(default: Number? = null, key: String? = null) = MutableNumberDelegate(this, key, default) -fun > M.node(key: String? = null) = MutableNodeDelegate(this, key) +fun > M.node(key: String? = null) = MutableNodeDelegate(this, key) @JvmName("safeString") fun > M.string(default: String, key: String? = null) = @@ -418,4 +423,4 @@ fun > M.number(key: String? = null, default: () -> Number) = inline fun , reified E : Enum> M.enum(default: E, key: String? = null) = - MutableSafeEnumvDelegate(this, key, default) { enumValueOf(it) } + MutableSafeEnumvDelegate(this, key, default) { enumValueOf(it) } \ No newline at end of file diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/names/Name.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/names/Name.kt index d8513e9c..070a8f75 100644 --- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/names/Name.kt +++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/names/Name.kt @@ -6,10 +6,9 @@ package hep.dataforge.names * The name is a dot separated list of strings like `token1.token2.token3`. * Each token could contain additional index in square brackets. 
 */
-inline class Name constructor(val tokens: List<NameToken>) {
+class Name(val tokens: List<NameToken>) {
 
-    val length
-        get() = tokens.size
+    val length get() = tokens.size
 
     /**
      * First token of the name or null if it is empty
@@ -35,6 +34,23 @@ inline class Name constructor(val tokens: List<NameToken>) {
 
     override fun toString(): String = tokens.joinToString(separator = NAME_SEPARATOR) { it.toString() }
 
+    override fun equals(other: Any?): Boolean {
+        return when (other) {
+            is Name -> this.tokens == other.tokens
+            is NameToken -> this.length == 1 && this.tokens.first() == other
+            else -> false
+        }
+    }
+
+    override fun hashCode(): Int {
+        return if (tokens.size == 1) {
+            tokens.first().hashCode()
+        } else {
+            tokens.hashCode()
+        }
+    }
+
     companion object {
         const val NAME_SEPARATOR = "."
     }
@@ -51,32 +67,55 @@ data class NameToken(val body: String, val index: String = "") {
         if (body.isEmpty()) error("Syntax error: Name token body is empty")
     }
 
+    private fun String.escape() =
+        replace("\\", "\\\\")
+            .replace(".", "\\.")
+            .replace("[", "\\[")
+            .replace("]", "\\]")
+
     override fun toString(): String = if (hasIndex()) {
-        "$body[$index]"
+        "${body.escape()}[$index]"
    } else {
-        body
+        body.escape()
    }
 
     fun hasIndex() = index.isNotEmpty()
 }
 
+/**
+ * Convert a [String] to name parsing it and extracting name tokens and index syntax.
+ * This operation is rather heavy so it should be used with care in high performance code.
+ */
 fun String.toName(): Name {
     if (isBlank()) return EmptyName
     val tokens = sequence {
         var bodyBuilder = StringBuilder()
         var queryBuilder = StringBuilder()
         var bracketCount: Int = 0
+        var escape: Boolean = false
 
         fun queryOn() = bracketCount > 0
 
-        asSequence().forEach {
-            if (queryOn()) {
-                when (it) {
-                    '[' -> bracketCount++
-                    ']' -> bracketCount--
+        for (it in this@toName) {
+            when {
+                escape -> {
+                    if (queryOn()) {
+                        queryBuilder.append(it)
+                    } else {
+                        bodyBuilder.append(it)
+                    }
+                    escape = false
+                }
+                it == '\\' -> {
+                    escape = true
                 }
-                if (queryOn()) queryBuilder.append(it)
-            } else {
-                when (it) {
+                queryOn() -> {
+                    when (it) {
+                        '[' -> bracketCount++
+                        ']' -> bracketCount--
+                    }
+                    if (queryOn()) queryBuilder.append(it)
+                }
+                else -> when (it) {
                     '.' -> {
                         yield(NameToken(bodyBuilder.toString(), queryBuilder.toString()))
                         bodyBuilder = StringBuilder()
@@ -96,6 +135,14 @@ fun String.toName(): Name {
     return Name(tokens.toList())
 }
 
+/**
+ * Convert the [String] to a [Name] by simply wrapping it in a single name token without parsing.
+ * The input string could contain dots and braces, but they are just escaped, not parsed.
+ */
+fun String.asName(): Name {
+    return NameToken(this).asName()
+}
+
 operator fun NameToken.plus(other: Name): Name = Name(listOf(this) + other.tokens)
 
 operator fun Name.plus(other: Name): Name = Name(this.tokens + other.tokens)
@@ -121,5 +168,20 @@ fun Name.withIndex(index: String): Name {
     return Name(tokens)
 }
 
+/**
+ * Fast [String]-based accessor for item map
+ */
+operator fun <T> Map<NameToken, T>.get(body: String, query: String = ""): T? = get(NameToken(body, query))
+
 operator fun <T> Map<Name, T>.get(name: String) = get(name.toName())
-operator fun <T> MutableMap<Name, T>.set(name: String, value: T) = set(name.toName(), value)
\ No newline at end of file
+operator fun <T> MutableMap<Name, T>.set(name: String, value: T) = set(name.toName(), value)
+
+/* Name comparison operations */
+
+fun Name.startsWith(token: NameToken): Boolean = first() == token
+
+fun Name.endsWith(token: NameToken): Boolean = last() == token
+
+fun Name.startsWith(name: Name): Boolean = length >= name.length && tokens.subList(0, name.length) == name.tokens
+
+fun Name.endsWith(name: Name): Boolean = length >= name.length && tokens.subList(length - name.length, length) == name.tokens
\ No newline at end of file
diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/values/Value.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/values/Value.kt
index 20f86779..cd68e040 100644
--- a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/values/Value.kt
+++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/values/Value.kt
@@ -43,6 +43,8 @@ interface Value {
     val list: List<Value>
         get() = listOf(this)
 
+    override fun equals(other: Any?): Boolean
+
     companion object {
         const val TYPE = "value"
@@ -82,6 +84,8 @@ object Null : Value {
     override val string: String get() = "@null"
 
     override fun toString(): String = value.toString()
+
+    override fun equals(other: Any?): Boolean = other === Null
 }
 
 /**
@@ -97,9 +101,12 @@ object True : Value {
     override val value: Any? get() = true
     override val type: ValueType get() = ValueType.BOOLEAN
     override val number: Number get() = 1.0
-    override val string: String get() = "+"
+    override val string: String get() = "true"
 
     override fun toString(): String = value.toString()
+
+    override fun equals(other: Any?): Boolean = other === True
+
 }
 
 /**
@@ -109,7 +116,11 @@ object False : Value {
     override val value: Any? get() = false
     override val type: ValueType get() = ValueType.BOOLEAN
     override val number: Number get() = -1.0
-    override val string: String get() = "-"
+    override val string: String get() = "false"
+
+    override fun toString(): String = value.toString()
+
+    override fun equals(other: Any?): Boolean = other === False
 }
 
 val Value.boolean get() = this == True || this.list.firstOrNull() == True || (type == ValueType.STRING && string.toBoolean())
@@ -122,12 +133,12 @@ class NumberValue(override val number: Number) : Value {
     override fun equals(other: Any?): Boolean {
         if (other !is Value) return false
         return when (number) {
-            is Short -> number == other.number.toShort()
-            is Long -> number == other.number.toLong()
-            is Byte -> number == other.number.toByte()
-            is Int -> number == other.number.toInt()
-            is Float -> number == other.number.toFloat()
-            is Double -> number == other.number.toDouble()
+            is Short -> number.toShort() == other.number.toShort()
+            is Long -> number.toLong() == other.number.toLong()
+            is Byte -> number.toByte() == other.number.toByte()
+            is Int -> number.toInt() == other.number.toInt()
+            is Float -> number.toFloat() == other.number.toFloat()
+            is Double -> number.toDouble() == other.number.toDouble()
             else -> number.toString() == other.number.toString()
         }
     }
@@ -148,7 +159,7 @@ class StringValue(override val string: String) : Value {
 
     override fun hashCode(): Int = string.hashCode()
 
-    override fun toString(): String = value.toString()
+    override fun toString(): String = "\"${value.toString()}\""
 }
 
 class EnumValue<E : Enum<E>>(override val value: E) : Value {
@@ -177,11 +188,11 @@ class ListValue(override val list: List<Value>) : Value {
     override val number: Number get() = list.first().number
     override val string: String get() = list.first().string
 
-    override fun toString(): String = value.toString()
+    override fun toString(): String = list.joinToString(prefix = "[", postfix = "]")
 
     override fun equals(other: Any?): Boolean {
         if (this === other) return true
         if (other !is Value) return false
         return list == other.list
     }
@@ -206,9 +220,6 @@ fun String.asValue(): Value = StringValue(this)
 
 fun Iterable<Value>.asValue(): Value = ListValue(this.toList())
 
-//TODO maybe optimized storage performance
-fun DoubleArray.asValue(): Value = ListValue(map{NumberValue(it)})
-
 fun IntArray.asValue(): Value = ListValue(map{NumberValue(it)})
 
 fun LongArray.asValue(): Value = ListValue(map{NumberValue(it)})
@@ -253,17 +264,4 @@ fun String.parseValue(): Value {
 
     //Give up and return a StringValue
     return StringValue(this)
-}
-
-class LazyParsedValue(override val string: String) : Value {
-    private val parsedValue by lazy { string.parseValue() }
-
-    override val value: Any?
-        get() = parsedValue.value
-    override val type: ValueType
-        get() = parsedValue.type
-    override val number: Number
-        get() = parsedValue.number
-
-    override fun toString(): String = value.toString()
-}
\ No newline at end of file
+}
diff --git a/dataforge-meta/src/commonMain/kotlin/hep/dataforge/values/exoticValues.kt b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/values/exoticValues.kt
new file mode 100644
index 00000000..f6122dbb
--- /dev/null
+++ b/dataforge-meta/src/commonMain/kotlin/hep/dataforge/values/exoticValues.kt
@@ -0,0 +1,47 @@
+package hep.dataforge.values
+
+
+/**
+ * A value built from a string whose content and type are parsed on demand
+ */
+class LazyParsedValue(override val string: String) : Value {
+    private val parsedValue by lazy { string.parseValue() }
+
+    override val value: Any? get() = parsedValue.value
+    override val type: ValueType get() = parsedValue.type
+    override val number: Number get() = parsedValue.number
+
+    override fun toString(): String = string
+
+    override fun equals(other: Any?): Boolean = other is Value && this.parsedValue == other
+}
+
+fun String.lazyParseValue(): LazyParsedValue = LazyParsedValue(this)
+
+/**
+ * A performance optimized version of list value for doubles
+ */
+class DoubleArrayValue(override val value: DoubleArray) : Value {
+    override val type: ValueType get() = ValueType.NUMBER
+    override val number: Double get() = value.first()
+    override val string: String get() = value.first().toString()
+    override val list: List<Value> get() = value.map { NumberValue(it) }
+
+    override fun equals(other: Any?): Boolean {
+        if (this === other) return true
+        if (other !is Value) return false
+
+        return when (other) {
+            is DoubleArrayValue -> value.contentEquals(other.value)
+            else -> list == other.list
+        }
+    }
+
+    override fun hashCode(): Int {
+        return value.contentHashCode()
+    }
+
+    override fun toString(): String = list.joinToString(prefix = "[", postfix = "]")
+}
+
+fun DoubleArray.asValue(): DoubleArrayValue = DoubleArrayValue(this)
diff --git a/dataforge-meta/src/commonTest/kotlin/hep/dataforge/descriptors/DescriptorTest.kt b/dataforge-meta/src/commonTest/kotlin/hep/dataforge/descriptors/DescriptorTest.kt
new file mode 100644
index 00000000..79800ec5
--- /dev/null
+++ b/dataforge-meta/src/commonTest/kotlin/hep/dataforge/descriptors/DescriptorTest.kt
@@ -0,0 +1,31 @@
+package hep.dataforge.descriptors
+
+import hep.dataforge.values.ValueType
+import kotlin.test.Test
+import kotlin.test.assertEquals
+
+class DescriptorTest {
+
+    val descriptor = NodeDescriptor.build {
+        node("aNode") {
+            info = "A root demo node"
+            value("b") {
+                info = "b number value"
+                type(ValueType.NUMBER)
+            }
+            node("otherNode") {
+                value("otherValue") {
+                    type(ValueType.BOOLEAN)
+                    default(false)
+                    info = "default value"
+                }
+            }
+        }
+    }
+
+    @Test
+    fun testAllowedValues() {
+        val allowed = descriptor.nodes["aNode"]?.values?.get("b")?.allowedValues
+        assertEquals(allowed, emptyList())
+    }
+}
\ No newline at end of file
diff --git a/dataforge-meta/src/commonTest/kotlin/hep/dataforge/meta/MutableMetaTest.kt b/dataforge-meta/src/commonTest/kotlin/hep/dataforge/meta/MutableMetaTest.kt
new file mode 100644
index 00000000..5ab75fd4
--- /dev/null
+++ b/dataforge-meta/src/commonTest/kotlin/hep/dataforge/meta/MutableMetaTest.kt
@@ -0,0 +1,22 @@
+package hep.dataforge.meta
+
+import kotlin.test.Test
+import kotlin.test.assertEquals
+
+class MutableMetaTest{
+    @Test
+    fun testRemove(){
+        val meta = buildMeta {
+            "aNode" to {
+                "innerNode" to {
+                    "innerValue" to true
+                }
+                "b" to 22
+                "c" to "StringValue"
+            }
+        }.toConfig()
+
+        meta.remove("aNode.c")
+        assertEquals(meta["aNode.c"], null)
+    }
+}
\ No newline at end of file
diff --git a/dataforge-meta/src/commonTest/kotlin/hep/dataforge/names/NameTest.kt b/dataforge-meta/src/commonTest/kotlin/hep/dataforge/names/NameTest.kt
index 1302777e..bcd75154 100644
--- a/dataforge-meta/src/commonTest/kotlin/hep/dataforge/names/NameTest.kt
+++ b/dataforge-meta/src/commonTest/kotlin/hep/dataforge/names/NameTest.kt
@@ -2,6 +2,8 @@ package hep.dataforge.names
 
 import kotlin.test.Test
 import kotlin.test.assertEquals
+import kotlin.test.assertFalse
+import kotlin.test.assertTrue
 
 class NameTest {
     @Test
@@ -16,4 +18,24 @@ class NameTest {
         val name2 = "token1".toName() + "token2[2].token3"
         assertEquals(name1, name2)
     }
+
+    @Test
+    fun comparisonTest(){
+        val name1 = "token1.token2.token3".toName()
+        val name2 = "token1.token2".toName()
+        val name3 = "token3".toName()
+        assertTrue { name1.startsWith(name2) }
+        assertTrue { name1.endsWith(name3) }
+        assertFalse { name1.startsWith(name3) }
+    }
+
+    @Test
+    fun escapeTest(){
+        val escapedName = "token\\.one.token2".toName()
+        val unescapedName = "token\\.one.token2".asName()
+
+        assertEquals(2, escapedName.length)
+        assertEquals(1, unescapedName.length)
+        assertEquals(escapedName, escapedName.toString().toName())
+    }
 }
\ No newline at end of file
diff --git a/dataforge-meta/src/jsMain/kotlin/hep/dataforge/meta/DynamicMeta.kt b/dataforge-meta/src/jsMain/kotlin/hep/dataforge/meta/DynamicMeta.kt
index 4719831b..694e2441 100644
--- a/dataforge-meta/src/jsMain/kotlin/hep/dataforge/meta/DynamicMeta.kt
+++ b/dataforge-meta/src/jsMain/kotlin/hep/dataforge/meta/DynamicMeta.kt
@@ -29,15 +29,16 @@ fun Meta.toDynamic(): dynamic {
     return res
 }
 
-class DynamicMeta(val obj: dynamic) : Meta {
+class DynamicMeta(val obj: dynamic) : MetaBase() {
     private fun keys() = js("Object.keys(this.obj)") as Array<String>
 
    private fun isArray(@Suppress("UNUSED_PARAMETER") obj: dynamic): Boolean =
         js("Array.isArray(obj)") as Boolean
 
+    @Suppress("UNCHECKED_CAST")
     private fun asItem(obj: dynamic): MetaItem<DynamicMeta>? {
         if (obj == null) return MetaItem.ValueItem(Null)
-        return when (jsTypeOf(obj)) {
+        return when (jsTypeOf(obj as? Any)) {
             "boolean" -> MetaItem.ValueItem(Value.of(obj as Boolean))
             "number" -> MetaItem.ValueItem(Value.of(obj as Number))
             "string" -> MetaItem.ValueItem(Value.of(obj as String))
diff --git a/dataforge-output/dataforge-output-html/build.gradle.kts b/dataforge-output-html/build.gradle.kts
similarity index 93%
rename from dataforge-output/dataforge-output-html/build.gradle.kts
rename to dataforge-output-html/build.gradle.kts
index d1693bab..98b9d0bd 100644
--- a/dataforge-output/dataforge-output-html/build.gradle.kts
+++ b/dataforge-output-html/build.gradle.kts
@@ -1,12 +1,10 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }
 
 val htmlVersion by rootProject.extra("0.6.12")
 
 kotlin {
-    jvm()
-    js()
     sourceSets {
         val commonMain by getting {
             dependencies {
diff --git a/dataforge-output/dataforge-output-html/src/jvmMain/kotlin/hep/dataforge/output/html/HtmlOutput.kt b/dataforge-output-html/src/commonMain/kotlin/hep/dataforge/output/html/HtmlOutput.kt
similarity index 91%
rename from dataforge-output/dataforge-output-html/src/jvmMain/kotlin/hep/dataforge/output/html/HtmlOutput.kt
rename to dataforge-output-html/src/commonMain/kotlin/hep/dataforge/output/html/HtmlOutput.kt
index bfbd74eb..b54b7eb7 100644
--- a/dataforge-output/dataforge-output-html/src/jvmMain/kotlin/hep/dataforge/output/html/HtmlOutput.kt
+++ b/dataforge-output-html/src/commonMain/kotlin/hep/dataforge/output/html/HtmlOutput.kt
@@ -27,8 +27,8 @@ class HtmlOutput(override val context: Context, private val consumer: T
         } else {
             val value = cache[obj::class]
             if (value == null) {
-                val answer = context.top<HtmlBuilder<*>>().values
-                    .filter { it.type.isInstance(obj) }.firstOrNull()
+                val answer =
+                    context.top<HtmlBuilder<*>>(HTML_CONVERTER_TYPE).values.firstOrNull { it.type.isInstance(obj) }
                 if (answer != null) {
                     cache[obj::class] = answer
                     answer
@@ -40,6 +40,7 @@
             }
         }
         context.launch(Dispatchers.Output) {
+            @Suppress("UNCHECKED_CAST")
             (builder as HtmlBuilder<T>).run { consumer.render(obj) }
         }
     }
diff --git a/dataforge-output/build.gradle.kts b/dataforge-output/build.gradle.kts
index 36811267..6c0a1e53 100644
--- a/dataforge-output/build.gradle.kts
+++ b/dataforge-output/build.gradle.kts
@@ -1,10 +1,8 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }
 
 kotlin {
-    jvm()
-    js()
     sourceSets {
         val commonMain by getting{
             dependencies {
diff --git a/dataforge-output/src/commonMain/kotlin/hep/dataforge/output/OutputManager.kt b/dataforge-output/src/commonMain/kotlin/hep/dataforge/output/OutputManager.kt
index 47d9f59c..448b3adb 100644
--- a/dataforge-output/src/commonMain/kotlin/hep/dataforge/output/OutputManager.kt
+++ b/dataforge-output/src/commonMain/kotlin/hep/dataforge/output/OutputManager.kt
@@ -1,6 +1,9 @@
 package hep.dataforge.output
 
-import hep.dataforge.context.*
+import hep.dataforge.context.AbstractPlugin
+import hep.dataforge.context.Context
+import hep.dataforge.context.PluginFactory
+import hep.dataforge.context.PluginTag
 import hep.dataforge.context.PluginTag.Companion.DATAFORGE_GROUP
 import hep.dataforge.meta.EmptyMeta
 import hep.dataforge.meta.Meta
@@ -13,7 +16,7 @@ import kotlin.reflect.KClass
 /**
  * A manager for outputs
  */
-interface OutputManager : Plugin {
+interface OutputManager {
 
     /**
      * Get an output specialized for given type, name and stage.
diff --git a/dataforge-scripting/build.gradle.kts b/dataforge-scripting/build.gradle.kts
index eb8f7742..757f0c33 100644
--- a/dataforge-scripting/build.gradle.kts
+++ b/dataforge-scripting/build.gradle.kts
@@ -1,5 +1,5 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }
 
 kotlin {
diff --git a/dataforge-scripting/src/commonMain/kotlin/hep/dataforge/scripting/Placeholder.kt b/dataforge-scripting/src/commonMain/kotlin/hep/dataforge/scripting/Placeholder.kt
new file mode 100644
index 00000000..a09e35c1
--- /dev/null
+++ b/dataforge-scripting/src/commonMain/kotlin/hep/dataforge/scripting/Placeholder.kt
@@ -0,0 +1,4 @@
+package hep.dataforge.scripting
+
+internal object Placeholder {
+}
\ No newline at end of file
diff --git a/dataforge-scripting/src/jvmMain/kotlin/hep/dataforge/scripting/Builders.kt b/dataforge-scripting/src/jvmMain/kotlin/hep/dataforge/scripting/Builders.kt
index 31f3a508..2df5a183 100644
--- a/dataforge-scripting/src/jvmMain/kotlin/hep/dataforge/scripting/Builders.kt
+++ b/dataforge-scripting/src/jvmMain/kotlin/hep/dataforge/scripting/Builders.kt
@@ -8,6 +8,7 @@ import hep.dataforge.workspace.WorkspaceBuilder
 import java.io.File
 import kotlin.script.experimental.api.*
 import kotlin.script.experimental.host.toScriptSource
+import kotlin.script.experimental.jvm.defaultJvmScriptingHostConfiguration
 import kotlin.script.experimental.jvm.dependenciesFromCurrentContext
 import kotlin.script.experimental.jvm.jvm
 import kotlin.script.experimental.jvmhost.BasicJvmScriptingHost
@@ -24,6 +25,7 @@ object Builders {
         jvm {
             dependenciesFromCurrentContext(wholeClasspath = true)
         }
+        hostConfiguration(defaultJvmScriptingHostConfiguration)
     }
 
     val evaluationConfiguration = ScriptEvaluationConfiguration {
diff --git a/dataforge-workspace/build.gradle.kts b/dataforge-workspace/build.gradle.kts
index e6aa9dd0..c576ef8e 100644
--- a/dataforge-workspace/build.gradle.kts
+++ b/dataforge-workspace/build.gradle.kts
@@ -1,5 +1,5 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }
 
 kotlin {
@@ -10,6 +10,7 @@ kotlin {
             dependencies {
                 api(project(":dataforge-context"))
                 api(project(":dataforge-data"))
+                api(project(":dataforge-output"))
             }
         }
     }
diff --git a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/SimpleWorkspace.kt b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/SimpleWorkspace.kt
index c7a54f3d..68438440 100644
--- a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/SimpleWorkspace.kt
+++ b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/SimpleWorkspace.kt
@@ -2,11 +2,11 @@ package hep.dataforge.workspace
 
 import hep.dataforge.context.Context
 import hep.dataforge.context.Global
+import hep.dataforge.context.content
 import hep.dataforge.data.DataNode
 import hep.dataforge.meta.Meta
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
-import hep.dataforge.provider.top
 
 
 /**
@@ -18,8 +18,9 @@ class SimpleWorkspace(
     override val targets: Map<String, Meta>,
     tasks: Collection<Task<*>>
 ) : Workspace {
+
     override val tasks: Map<Name, Task<*>> by lazy {
-        context.top<Task<*>>(Task.TYPE) + tasks.associate { it.name.toName() to it }
+        context.content<Task<*>>(Task.TYPE) + tasks.associate { it.name.toName() to it }
     }
 
     companion object {
diff --git a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/TaskModel.kt b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/TaskModel.kt
index eaaac235..b6c4b2b6 100644
--- a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/TaskModel.kt
+++ b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/TaskModel.kt
@@ -8,6 +8,7 @@ package hep.dataforge.workspace
 import hep.dataforge.data.DataFilter
 import hep.dataforge.data.DataTree
 import hep.dataforge.data.DataTreeBuilder
+import hep.dataforge.data.dataSequence
 import hep.dataforge.meta.*
 import hep.dataforge.names.EmptyName
 import hep.dataforge.names.Name
@@ -51,7 +52,7 @@ data class TaskModel(
  */
 fun TaskModel.buildInput(workspace: Workspace): DataTree<Any> {
     return DataTreeBuilder(Any::class).apply {
-        dependencies.asSequence().flatMap { it.apply(workspace).data() }.forEach { (name, data) ->
+        dependencies.asSequence().flatMap { it.apply(workspace).dataSequence() }.forEach { (name, data) ->
             //TODO add concise error on replacement
             this[name] = data
         }
diff --git a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/Workspace.kt b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/Workspace.kt
index 1945c74e..f5f0f3a6 100644
--- a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/Workspace.kt
+++ b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/Workspace.kt
@@ -3,6 +3,7 @@ package hep.dataforge.workspace
 import hep.dataforge.context.ContextAware
 import hep.dataforge.data.Data
 import hep.dataforge.data.DataNode
+import hep.dataforge.data.dataSequence
 import hep.dataforge.meta.Meta
 import hep.dataforge.meta.MetaBuilder
 import hep.dataforge.meta.buildMeta
@@ -29,27 +30,16 @@ interface Workspace : ContextAware, Provider {
      */
     val tasks: Map<Name, Task<*>>
 
-    override fun provideTop(target: String, name: Name): Any? {
+    override fun provideTop(target: String): Map<Name, Any> {
         return when (target) {
-            "target", Meta.TYPE -> targets[name.toString()]
-            Task.TYPE -> tasks[name]
-            Data.TYPE -> data[name]
-            DataNode.TYPE -> data.getNode(name)
-            else -> null
+            "target", Meta.TYPE -> targets.mapKeys { it.key.toName() }
+            Task.TYPE -> tasks
+            Data.TYPE -> data.dataSequence().toMap()
+            //DataNode.TYPE -> data.nodes.toMap()
+            else -> emptyMap()
         }
     }
 
-    override fun listNames(target: String): Sequence<Name> {
-        return when (target) {
-            "target", Meta.TYPE -> targets.keys.asSequence().map { it.toName() }
-            Task.TYPE -> tasks.keys.asSequence().map { it }
-            Data.TYPE -> data.data().map { it.first }
-            DataNode.TYPE -> data.nodes().map { it.first }
-            else -> emptySequence()
-        }
-    }
-
-
     /**
      * Invoke a task in the workspace utilizing caching if possible
      */
@@ -64,15 +54,6 @@ interface Workspace : ContextAware, Provider {
         }
     }
 
-//    /**
-//     * Invoke a task in the workspace utilizing caching if possible
-//     */
-//    operator fun Task.invoke(targetName: String): DataNode {
-//        val target = targets[targetName] ?: error("A target with name $targetName not found in ${this@Workspace}")
-//        context.logger.info { "Running ${this.name} on $target" }
-//        return invoke(target)
-//    }
-
     companion object {
         const val TYPE = "workspace"
     }
diff --git a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/WorkspaceBuilder.kt b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/WorkspaceBuilder.kt
index f62753a2..999ee50c 100644
--- a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/WorkspaceBuilder.kt
+++ b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/WorkspaceBuilder.kt
@@ -8,8 +8,6 @@ import hep.dataforge.data.DataTreeBuilder
 import hep.dataforge.meta.*
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
-import kotlinx.coroutines.CoroutineScope
-import kotlinx.coroutines.GlobalScope
 
 @TaskBuildScope
 interface WorkspaceBuilder {
@@ -26,7 +24,7 @@ interface WorkspaceBuilder {
 /**
  * Set the context for future workspcace
  */
-fun WorkspaceBuilder.context(name: String, block: ContextBuilder.() -> Unit = {}) {
+fun WorkspaceBuilder.context(name: String = "WORKSPACE", block: ContextBuilder.() -> Unit = {}) {
     context = ContextBuilder(name, parentContext).apply(block).build()
 }
 
@@ -36,14 +34,14 @@ fun WorkspaceBuilder.data(name: Name, data: Data<Any>) {
 
 fun WorkspaceBuilder.data(name: String, data: Data<Any>) = data(name.toName(), data)
 
-fun WorkspaceBuilder.static(name: Name, data: Any, scope: CoroutineScope = GlobalScope, meta: Meta = EmptyMeta) =
-    data(name, Data.static(scope, data, meta))
+fun WorkspaceBuilder.static(name: Name, data: Any, meta: Meta = EmptyMeta) =
+    data(name, Data.static(data, meta))
 
-fun WorkspaceBuilder.static(name: Name, data: Any, scope: CoroutineScope = GlobalScope, block: MetaBuilder.() -> Unit = {}) =
-    data(name, Data.static(scope, data, buildMeta(block)))
+fun WorkspaceBuilder.static(name: Name, data: Any, block: MetaBuilder.() -> Unit = {}) =
+    data(name, Data.static(data, buildMeta(block)))
 
-fun WorkspaceBuilder.static(name: String, data: Any, scope: CoroutineScope = GlobalScope, block: MetaBuilder.() -> Unit = {}) =
-    data(name, Data.static(scope, data, buildMeta(block)))
+fun WorkspaceBuilder.static(name: String, data: Any, block: MetaBuilder.() -> Unit = {}) =
+    data(name, Data.static(data, buildMeta(block)))
 
 fun WorkspaceBuilder.data(name: Name, node: DataNode<Any>) {
     this.data[name] = node
diff --git a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/WorkspacePlugin.kt b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/WorkspacePlugin.kt
new file mode 100644
index 00000000..2fefd4f8
--- /dev/null
+++ b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/WorkspacePlugin.kt
@@ -0,0 +1,19 @@
+package hep.dataforge.workspace
+
+import hep.dataforge.context.AbstractPlugin
+import hep.dataforge.names.Name
+import hep.dataforge.names.toName
+
+/**
+ * An abstract plugin with some additional boilerplate to effectively work with workspace context
+ */
+abstract class WorkspacePlugin : AbstractPlugin() {
+    abstract val tasks: Collection<Task<*>>
+
+    override fun provideTop(target: String): Map<Name, Any> {
+        return when(target){
+            Task.TYPE -> tasks.associateBy { it.name.toName() }
+            else -> emptyMap()
+        }
+    }
+}
\ No newline at end of file
diff --git a/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/dataUtils.kt b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/dataUtils.kt
new file mode 100644
index 00000000..f6d27774
--- /dev/null
+++ b/dataforge-workspace/src/commonMain/kotlin/hep/dataforge/workspace/dataUtils.kt
@@ -0,0 +1,14 @@
+package hep.dataforge.workspace
+
+import hep.dataforge.data.Data
+import hep.dataforge.io.Envelope
+import hep.dataforge.io.IOFormat
+import hep.dataforge.io.readWith
+import kotlin.reflect.KClass
+
+/**
+ * Convert an [Envelope] to a data via given format. The actual parsing is done lazily.
+ */
+fun <T : Any> Envelope.toData(type: KClass<T>, format: IOFormat<T>): Data<T> = Data(type, meta) {
+    data?.readWith(format) ?: error("Can't convert envelope without data to Data")
+}
\ No newline at end of file
diff --git a/dataforge-workspace/src/jvmMain/kotlin/hep/dataforge/workspace/TaskBuilder.kt b/dataforge-workspace/src/jvmMain/kotlin/hep/dataforge/workspace/TaskBuilder.kt
index 761dc8f3..175c4d87 100644
--- a/dataforge-workspace/src/jvmMain/kotlin/hep/dataforge/workspace/TaskBuilder.kt
+++ b/dataforge-workspace/src/jvmMain/kotlin/hep/dataforge/workspace/TaskBuilder.kt
@@ -27,7 +27,7 @@ class TaskBuilder(val name: String) {
         val localData = if (from.isEmpty()) {
             node
         } else {
-            node.getNode(from.toName()) ?: return null
+            node[from.toName()].node ?: return null
         }
         return transform(workspace.context, model, localData)
     }
@@ -207,7 +207,7 @@ class TaskBuilder(val name: String) {
     }
 }
 
-fun task(name: String, builder: TaskBuilder.() -> Unit): GenericTask<Any> {
+fun Workspace.Companion.task(name: String, builder: TaskBuilder.() -> Unit): GenericTask<Any> {
     return TaskBuilder(name).apply(builder).build()
 }
 
diff --git a/dataforge-workspace/src/jvmMain/kotlin/hep/dataforge/workspace/fileData.kt b/dataforge-workspace/src/jvmMain/kotlin/hep/dataforge/workspace/fileData.kt
new file mode 100644
index 00000000..f20e18cf
--- /dev/null
+++ b/dataforge-workspace/src/jvmMain/kotlin/hep/dataforge/workspace/fileData.kt
@@ -0,0 +1,92 @@
+package hep.dataforge.workspace
+
+import hep.dataforge.data.Data
+import hep.dataforge.descriptors.NodeDescriptor
+import hep.dataforge.io.*
+import hep.dataforge.meta.EmptyMeta
+import hep.dataforge.meta.Meta
+import kotlinx.coroutines.Dispatchers
+import kotlinx.coroutines.coroutineScope
+import kotlinx.coroutines.withContext
+import kotlinx.io.nio.asInput
+import kotlinx.io.nio.asOutput
+import java.nio.file.Files
+import java.nio.file.Path
+import java.nio.file.StandardOpenOption
+import kotlin.reflect.KClass
+
+/**
+ * Read meta from file in a given [format]
+ */
+suspend fun Path.readMeta(format: MetaFormat, descriptor: NodeDescriptor? = null): Meta {
+    return withContext(Dispatchers.IO) {
+        format.run {
+            Files.newByteChannel(this@readMeta, StandardOpenOption.READ)
+                .asInput()
+                .readMeta(descriptor)
+        }
+    }
+}
+
+/**
+ * Write meta to file in a given [format]
+ */
+suspend fun Meta.write(path: Path, format: MetaFormat, descriptor: NodeDescriptor? = null) {
+    withContext(Dispatchers.IO) {
+        format.run {
+            Files.newByteChannel(path, StandardOpenOption.WRITE, StandardOpenOption.CREATE_NEW)
+                .asOutput()
+                .writeMeta(this@write, descriptor)
+        }
+    }
+}
+
+/**
+ * Read data with supported envelope format and binary format. If envelope format is null, then read binary directly from file.
+ * @param type explicit type of data read
+ * @param format binary format
+ * @param envelopeFormat the format of envelope. If null, file is read directly
+ * @param metaFile the relative file for optional meta override
+ * @param metaFileFormat the meta format for override
+ */
+suspend fun <T : Any> Path.readData(
+    type: KClass<T>,
+    format: IOFormat<T>,
+    envelopeFormat: EnvelopeFormat? = null,
+    metaFile: Path = resolveSibling("$fileName.meta"),
+    metaFileFormat: MetaFormat = JsonMetaFormat
+): Data<T> {
+    return coroutineScope {
+        val externalMeta = if (Files.exists(metaFile)) {
+            metaFile.readMeta(metaFileFormat)
+        } else {
+            null
+        }
+        if (envelopeFormat == null) {
+            Data(type, externalMeta ?: EmptyMeta) {
+                withContext(Dispatchers.IO) {
+                    format.run {
+                        Files.newByteChannel(this@readData, StandardOpenOption.READ)
+                            .asInput()
+                            .readThis()
+                    }
+                }
+            }
+        } else {
+            withContext(Dispatchers.IO) {
+                readEnvelope(envelopeFormat).let {
+                    if (externalMeta == null) {
+                        it
+                    } else {
+                        it.withMetaLayers(externalMeta)
+                    }
+                }.toData(type, format)
+            }
+        }
+    }
+}
+
+//suspend fun Path.writeData(
+//    data: Data,
+//    format: IOFormat,
+//    )
\ No newline at end of file
diff --git a/dataforge-workspace/src/jvmTest/kotlin/hep/dataforge/workspace/SimpleWorkspaceTest.kt b/dataforge-workspace/src/jvmTest/kotlin/hep/dataforge/workspace/SimpleWorkspaceTest.kt
index 52ba85b5..51471525 100644
--- a/dataforge-workspace/src/jvmTest/kotlin/hep/dataforge/workspace/SimpleWorkspaceTest.kt
+++ b/dataforge-workspace/src/jvmTest/kotlin/hep/dataforge/workspace/SimpleWorkspaceTest.kt
@@ -1,14 +1,33 @@
 package hep.dataforge.workspace
 
+import hep.dataforge.context.PluginTag
 import hep.dataforge.data.first
 import hep.dataforge.data.get
+import hep.dataforge.meta.boolean
+import hep.dataforge.meta.get
 import org.junit.Test
 import kotlin.test.assertEquals
+import kotlin.test.assertTrue
 
 
 class SimpleWorkspaceTest {
 
+    val testPlugin = object : WorkspacePlugin() {
+        override val tag: PluginTag = PluginTag("test")
+
+        val contextTask = Workspace.task("test") {
+            pipe {
+                context.logger.info { "Test: $it" }
+            }
+        }
+        override val tasks: Collection<Task<*>> = listOf(contextTask)
+    }
+
     val workspace = SimpleWorkspace.build {
+
+        context {
+            plugin(testPlugin)
+        }
+
         repeat(100) {
             static("myData[$it]", it)
         }
@@ -19,6 +38,9 @@ class SimpleWorkspaceTest {
             allData()
         }
         pipe { data ->
+            if (meta["testFlag"].boolean == true) {
+                println("flag")
+            }
             context.logger.info { "Starting square on $data" }
             data * data
         }
@@ -39,13 +61,13 @@ class SimpleWorkspaceTest {
             allData()
         }
         joinByGroup { context ->
-            group("even", filter = { name, data -> name.toString().toInt() % 2 == 0 }) {
+            group("even", filter = { name, _ -> name.toString().toInt() % 2 == 0 }) {
                 result { data ->
                     context.logger.info { "Starting even" }
                     data.values.average()
                 }
             }
-            group("odd", filter = { name, data -> name.toString().toInt() % 2 == 1 }) {
+            group("odd", filter = { name, _ -> name.toString().toInt() % 2 == 1 }) {
                 result { data ->
                     context.logger.info { "Starting odd" }
                     data.values.average()
@@ -54,20 +76,35 @@ class SimpleWorkspaceTest {
             }
         }
 
-        task("delta"){
-            model{
+        task("delta") {
+            model {
                 dependsOn("average")
             }
-            join {data->
+            join { data ->
                 data["even"]!! - data["odd"]!!
             }
         }
+
+        target("empty") {}
     }
 
     @Test
     fun testWorkspace() {
         val node = workspace.run("sum")
         val res = node.first()
-        assertEquals(328350, res.get())
+        assertEquals(328350, res?.get())
+    }
+
+    @Test
+    fun testMetaPropagation() {
+        val node = workspace.run("sum") { "testFlag" to true }
+        val res = node.first()?.get()
+    }
+
+    @Test
+    fun testPluginTask() {
+        val tasks = workspace.tasks
+        assertTrue { tasks["test.test"] != null }
+        //val node = workspace.run("test.test", "empty")
     }
 }
\ No newline at end of file
diff --git a/settings.gradle.kts b/settings.gradle.kts
index 03b9bff9..011123b9 100644
--- a/settings.gradle.kts
+++ b/settings.gradle.kts
@@ -1,16 +1,18 @@
 pluginManagement {
     repositories {
+        mavenLocal()
         jcenter()
         gradlePluginPortal()
         maven("https://dl.bintray.com/kotlin/kotlin-eap")
+        maven("https://dl.bintray.com/kotlin/kotlinx")
+        maven("https://dl.bintray.com/mipt-npm/scientifik")
+        maven("https://dl.bintray.com/mipt-npm/dev")
     }
     resolutionStrategy {
         eachPlugin {
             when (requested.id.id) {
                 "kotlinx-atomicfu" -> useModule("org.jetbrains.kotlinx:atomicfu-gradle-plugin:${requested.version}")
-                "kotlin-multiplatform" -> useModule("org.jetbrains.kotlin:kotlin-gradle-plugin:${requested.version}")
-                "kotlin2js" -> useModule("org.jetbrains.kotlin:kotlin-gradle-plugin:${requested.version}")
-                "org.jetbrains.kotlin.frontend" -> useModule("org.jetbrains.kotlin:kotlin-frontend-plugin:0.0.45")
+                "scientifik.mpp", "scientifik.publish" -> useModule("scientifik:gradle-tools:${requested.version}")
             }
         }
     }
@@ -25,7 +27,7 @@ include(
     ":dataforge-context",
     ":dataforge-data",
     ":dataforge-output",
-    ":dataforge-output:dataforge-output-html",
+    ":dataforge-output-html",
     ":dataforge-workspace",
     ":dataforge-scripting"
 )
\ No newline at end of file
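A note on the provider change in `Workspace.kt` above: the pair `provideTop(target, name)` / `listNames(target)` is replaced by a single `provideTop(target)` that returns the whole content map for a target. The sketch below illustrates only the shape of that change with simplified stand-in types (`Provider`, `DemoWorkspace`, and string-keyed maps are hypothetical placeholders, not the real `hep.dataforge` API, which keys by `Name`):

```kotlin
// Illustrative sketch only: simplified stand-ins for DataForge's Provider API.
// One map-returning method replaces the old per-name lookup plus name listing.
interface Provider {
    fun provideTop(target: String): Map<String, Any>
}

class DemoWorkspace(
    private val targets: Map<String, Any>,
    private val tasks: Map<String, Any>
) : Provider {
    // Each target kind resolves to its full content map; unknown targets are empty.
    override fun provideTop(target: String): Map<String, Any> = when (target) {
        "target" -> targets
        "task" -> tasks
        else -> emptyMap()
    }
}

fun main() {
    val workspace = DemoWorkspace(
        targets = mapOf("empty" to "{}"),
        tasks = mapOf("test.test" to "pipe task")
    )
    // Enumeration and lookup both come from the same call now:
    println(workspace.provideTop("task").keys)
    println(workspace.provideTop("task")["test.test"])
}
```

Returning the whole map lets callers both enumerate names and resolve a single one from one entry point, which is why the separate `listNames` override in `Workspace` and the `hep.dataforge.provider.top` import could be deleted in this patch.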