Data
- Summary
- Configuration
- Data Provider
- Data Context
- Data Registration
- Data Mappings
- Auditing
- Soft Delete
- Triggers
- Caching
- Migration
- Health Checks
- Injected Services
Data access management in Nano.
That includes database creation, transaction management and object-relational mapping.
Nano uses Entity Framework Core for object-relational mapping and for handling data contexts within the application. The data layer in Nano revolves around the data provider and the data context.
The IDataProvider is registered during startup, and the implementing type defines the data provider used in the application.
The DbContext of Entity Framework is used in Nano through the abstract class BaseDbContext. Furthermore, the class DefaultDbContext derives from BaseDbContext, and your custom data context implementation should derive from that. Both Nano data context implementations override certain aspects of Entity Framework, in order to extend its functionality and to work around missing features.
Besides the above, the data context and the data provider must be initialized during application startup, and models must be mapped to their corresponding data mapping implementations.
When starting the application, the database of the data context will be created (if enabled). Additionally, any pending migrations will in turn be applied to the database.
The Data section of the configuration defines the behavior related to data access.
The section is serialized into an instance of DataOptions and injected as a dependency during startup, thus available for injection throughout the application.
See App Settings - Data for details about the section and the meaning of the variables.
"Data": {
"BatchSize": "25",
"BulkBatchSize": "500",
"BulkBatchDelay": "1000",
"QueryRetryCount": 0,
"QueryIncludeDepth": "4",
"UseAudit": true,
"UseAutoSave": false,
"UseMemoryCache": true,
"UseLazyLoading": true,
"UseCreateDatabase": false,
"UseMigrateDatabase": true,
"UseSoftDeletetion": true,
"UseSensitiveDataLogging": false,
"UseConnectionPooling": false,
"UseHealthCheck": true,
"UnhealthyStatus": "Unhealthy",
"DefaultCollation": null,
"ConnectionString": "Server={host};Database={database};Uid={user};Pwd={password}",
"MemoryCache": {
"MaxEntries": 5000,
"ExpirationTimeoutInSeconds": 300,
"ExpirationScanFrequencyInSeconds": 60,
"ExpirationMode": "Sliding",
"IgnoredTableNames": [
]
}
}
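Since DataOptions is bound from the section above and registered in the container, it can be constructor-injected wherever it is needed. Below is a minimal sketch using a hypothetical MyOptionsConsumer class; the property names are assumed to mirror the configuration keys shown above, and the options instance is assumed to be resolvable directly (it may alternatively be exposed through IOptions<DataOptions>).

using Nano.Data;

public class MyOptionsConsumer
{
    private readonly DataOptions dataOptions;

    public MyOptionsConsumer(DataOptions dataOptions)
    {
        this.dataOptions = dataOptions;
    }

    public bool IsCachingEnabled()
    {
        // Reads a value bound from the "Data" configuration section.
        // UseMemoryCache is assumed to match the key of the same name above.
        return this.dataOptions.UseMemoryCache;
    }
}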
Nano provides several data providers (and more can be added on request), so usually there is no need to implement a custom data provider for your application.
Data providers implement the interface IDataProvider. It contains a single method, Configure(...), which is responsible for handling any configuration and setup required by the data provider.
The data providers currently supported by Nano can be referenced in the Appendix - Supported Providers.
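Should a provider ever be needed that Nano doesn't ship, the general shape is sketched below. Note that this is an assumption-laden sketch: the parameter of Configure(...) shown here (an Entity Framework DbContextOptionsBuilder) is a guess for illustration only, so check Nano.Data.Interfaces.IDataProvider for the actual signature before implementing.

using Microsoft.EntityFrameworkCore;
using Nano.Data.Interfaces;

// Sketch of a custom provider. The Configure(...) parameter is assumed;
// consult IDataProvider in Nano.Data.Interfaces for the real signature.
public class MyCustomProvider : IDataProvider
{
    public void Configure(DbContextOptionsBuilder builder)
    {
        // Configure the underlying Entity Framework provider here,
        // e.g. builder.UseSqlite("Data Source=app.db").
    }
}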
Create the data context implementation by deriving a class from DefaultDbContext.
Later, associations between models and their mappings will be declared in the method OnModelCreating(...). Remember to invoke the base class method!
// Deriving from default db-context.
public class MyDbContext : DefaultDbContext
{
    public MyDbContext(DbContextOptions options)
        : base(options)
    { }

    protected override void OnModelCreating(ModelBuilder builder)
    {
        base.OnModelCreating(builder);
    }
}
// Deriving from base db-context, specifying custom identity type.
public class MyDbContext : BaseDbContext<string>
{
    public MyDbContext(DbContextOptions options)
        : base(options)
    { }

    protected override void OnModelCreating(ModelBuilder builder)
    {
        base.OnModelCreating(builder);
    }
}
The data context and data provider must be registered as dependencies.
Invoke the method .AddDataContext<TProvider, TContext>(), using the data provider and data context implementations as generic type parameters.
By default, the BaseDbContext dependency is registered to resolve to DefaultDbContext. When registering a custom data context implementation, that registration is overridden, and both base classes will resolve to the TContext generic type parameter implementation.
// With default Guid for Identity type.
.ConfigureServices(x =>
{
    x.AddDataContext<MySqlProvider, MyDbContext>();
})

// With string as custom type for Identity type.
.ConfigureServices(x =>
{
    x.AddDataContext<MySqlProvider, MyDbContext, string>();
})
By default, DbContext resolves to NullDbContext, which, as the name indicates, doesn't do anything except ensure the dependency will resolve. Registering a data dependency will override the NullDbContext with the configured data context implementation.
The models in the application need to be mapped and known to the data context.
When having both the model and the mapping, the two need to be associated. In the overridden method OnModelCreating(...) of the data context implementation, the method .AddMapping<MyEntity, MyEntityMapping>() needs to be invoked for each model/mapping pair in the application.
The details about model mappings are covered in Models - Data Mappings.
protected override void OnModelCreating(ModelBuilder builder)
{
    base.OnModelCreating(builder);

    builder
        .AddMapping<MyEntity, MyEntityMapping>();
}
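For context, a model and mapping pair referenced by the registration above could look roughly as follows. MyEntity and its Name property are hypothetical, the namespaces of DefaultEntity and DefaultEntityMapping are not shown, and the mapping calls are plain Entity Framework Core; see Models - Data Mappings for the actual options.

using Microsoft.EntityFrameworkCore.Metadata.Builders;

// Hypothetical model deriving from Nano's DefaultEntity.
public class MyEntity : DefaultEntity
{
    public string Name { get; set; }
}

// Matching mapping, following the pattern used elsewhere on this page.
public class MyEntityMapping : DefaultEntityMapping<MyEntity>
{
    public override void Map(EntityTypeBuilder<MyEntity> builder)
    {
        base.Map(builder);

        // Standard Entity Framework Core mapping calls.
        builder
            .Property(x => x.Name)
            .HasMaxLength(255);
    }
}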
When audit is enabled in the data section of the configuration, changes to all entities deriving from IEntity will be tracked. Auditing happens automatically and is executed asynchronously, to avoid impacting the response time.
Two additional tables are created in the database for audit data storage.
- __EFAudit
- __EFAuditProperties
The tables are created even when UseAudit is disabled, but they will remain empty. This makes it possible to switch at a later time, without having to worry about database migrations. The first table contains a row for each change made to an entity, while the second stores one row for each property of the entity.
To include a model for audit, simply implement the empty interface IEntityAuditable. Normally this isn't required explicitly, as all models are included when deriving from DefaultEntity and auditing is enabled for the application.
To exclude models, implement the interface IEntityAuditableNegated or apply the ExcludeAttribute.
The audit implementation is based on the EntityFramework Plus project.
Soft deletion must be enabled in the data section of the configuration.
When implementing the interface IEntityDeletableSoft, or when deriving a model implementation from DefaultEntity, the entity will be soft deleted when removed from the data context. When soft deleted, the data doesn't get removed; instead, the row is flagged as deleted and filtered out in future queries.
When dealing with soft deleted entities together with unique indexes, a conflict can arise when one or more deleted entities have duplicate unique values. Nano automatically adjusts unique indexes by appending the IsDeleted property, with the exception of the property defined as primary key.
Unlike a regular delete, soft-deleting entities doesn't support cascading deletes.
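In practice, soft-deleting looks exactly like a regular delete; only the effect in the database differs. A minimal sketch, assuming the MyDbContext shown earlier, a hypothetical MyEntity deriving from DefaultEntity, and the default Guid identity type:

using System;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public static class SoftDeleteExample
{
    public static async Task DeleteAsync(MyDbContext dbContext, Guid id)
    {
        var entity = await dbContext.Set<MyEntity>()
            .FirstAsync(x => x.Id == id);

        // Looks like a hard delete, but with soft deletion enabled the row
        // is only flagged (IsDeleted) and filtered out of future queries.
        dbContext.Remove(entity);

        await dbContext.SaveChangesAsync();
    }
}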
Nano supports mapping triggers for the models. Note that these triggers are executed at code level, as part of saving changes to the Entity Framework context, and are not to be confused with database triggers. The implementation is based on the EntityFramework.Triggers library, recommended by Microsoft.
The triggers are defined as part of the mapping implementation, and may be associated with insert, update or delete. Furthermore, the action can be invoked either before or after the triggering operation.
public class MyEntityMapping : DefaultEntityMapping<MyEntity>
{
    public override void Map(EntityTypeBuilder<MyEntity> builder)
    {
        base.Map(builder);

        builder
            .OnUpdated(entry =>
            {
                var dbContext = entry.Context;
                var entity = entry.Entity;

                // action implementation.
            });
    }
}
Nano supports the following triggers.
- On Inserted
- On Inserting
- On Updated
- On Updating
- On Deleted
- On Deleting
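As a complement to the OnUpdated(...) example above, a before-trigger can adjust the entity before it is written. The OnInserting(...) call below is assumed to mirror the OnUpdated(...) extension shown earlier, and the Name property is hypothetical.

public class MyEntityMapping : DefaultEntityMapping<MyEntity>
{
    public override void Map(EntityTypeBuilder<MyEntity> builder)
    {
        base.Map(builder);

        builder
            .OnInserting(entry =>
            {
                // Runs before the insert is saved to the database.
                entry.Entity.Name ??= "unnamed";
            });
    }
}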
For more details about triggers and how to use them, consult the documentation of EntityFramework.Triggers.
Simple memory caching can be enabled by setting Data.UseMemoryCache = true in the configuration. Once a query has been executed, it is cached for future invocations.
Entity Framework's support for database migration is made easy with Nano.
Creating migration scripts at design time, which are in turn applied when running the application, can be accomplished simply by including an implementation of BaseDbContextFactory<TProvider, TContext>. It's similar to the registration done in startup, and no additional implementation is needed.
public class MyDbContextFactory : BaseDbContextFactory<MySqlProvider, MyDbContext>
{
}
Next, open the NuGet Package Manager Console in Visual Studio (Tools -> NuGet Package Manager -> Package Manager Console), and run the following command (replace the parameters).
PM> Add-Migration -Name {name} -StartupProject {project}
Name is the name of the migration, and project is the project where the DbContext implementation is located and where the migration script will be saved. Finally, the environment is the configuration to use; it's important that the connection string in the settings file of that environment is valid, otherwise an error occurs and the migration fails.
When enabling health checks in the data section of the configuration, the application will be configured with a health check for the data provider. When the application starts, a check is made to ensure that the data provider is up and running, returning a healthy status when checked.
The health status of the application, including the data provider, can be found here:
http://{host}:{port}/healthz
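For a quick check, the endpoint can be probed with a plain HTTP request; the host and port below are placeholders for whatever the application runs on.

using System.Net.Http;
using System.Threading.Tasks;

public static class HealthCheckProbe
{
    public static async Task<string> GetHealthAsync()
    {
        using var client = new HttpClient();

        // Replace localhost:5000 with the actual host and port.
        var response = await client.GetAsync("http://localhost:5000/healthz");

        // Typically returns "Healthy", or the configured UnhealthyStatus value.
        return await response.Content.ReadAsStringAsync();
    }
}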
When building the data provider, data-related dependencies are configured and initialized.
Nano.Data.DataOptions
Nano.Data.BaseDbContext
Nano.Data.BaseDbContextFactory
Nano.Data.DefaultDbContext
Nano.Data.DefaultDbContextFactory
Nano.Data.Interfaces.IDataProvider
For a full list of services and dependencies, see Injected Dependencies.
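With the registrations above in place, the data context can be constructor-injected like any other dependency. A sketch, reusing the hypothetical MyDbContext from earlier:

public class MyRepository
{
    private readonly MyDbContext dbContext;

    public MyRepository(MyDbContext dbContext)
    {
        // Resolves through the .AddDataContext<MySqlProvider, MyDbContext>()
        // registration shown earlier; BaseDbContext and DefaultDbContext
        // resolve to the same implementation.
        this.dbContext = dbContext;
    }
}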