Corrected runtime bug and redundant code in 'add_previous_characters()' of the 'i18n_lexer' crate. Completed implementation of the 'Message' crate, and corrected all tests and examples affected by these changes.
rizzen-yazston committed Mar 29, 2023
1 parent b27f13b commit afa0d1c
Showing 24 changed files with 872 additions and 419 deletions.
18 changes: 16 additions & 2 deletions CHANGELOG.asciidoc
@@ -3,13 +3,13 @@ Rizzen Yazston

== i18n 0.6.0 (2023-03-??)

- WARNING: This update has many API breaking changes for many `i18n` crates.
+ WARNING: This update has API breaking changes for some `i18n` crates.

The breaking changes result from altering how ICU data providers are used and passed to various components; consequently many examples are affected even when their module is not directly touched by the ICU data provider change.

* Added the `icu` crate:

- * Added `IcuDataProvider`, `DataProviderWrapper`, and `IcuError`.
+ ** Added `IcuDataProvider`, `DataProviderWrapper`, and `IcuError`.

** Added the `Cargo.toml`, license, and documentation.

@@ -23,15 +23,29 @@ Breaking change is the result of changing how ICU data providers are used and pa

** Added `LStringProviderSqlite3`, `AsLStringProviderSqlite3`, and its blanket implementation.

** Removed the requirement of `RefCell` for `language_tag_registry` parameter and struct, as it was redundant.

** Updated tests, examples and documentation.

* Updated the `i18n_lexer` crate:

** Made the `Lexer` struct private, converted the `tokenise()` and `add_previous_characters()` methods into standalone functions, removed the `try_new` method, and added `&Rc<IcuDataProvider>` to the `tokenise()` function parameters.

** Removed `error.rs`, as neither of the functions returns errors.

** Updated the `lib.rs` to remove `error` module.

** Updated `Cargo.toml`, tests, examples and documentation.

* Updated the `i18n_pattern` crate:

** Updated `Formatter` to use `IcuDataProvider`.

** Updated `Cargo.toml`, tests, examples and documentation.

* Updated `i18n_lstring` crate:

** Added `Clone` to the `#[derive()]` attribute to allow cloning (see the sketch after this changelog excerpt).

* Added the `i18n_message` crate:

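To illustrate the `i18n_lstring` entry above: a minimal sketch of what adding `Clone` to the derive list enables. The struct below is a stand-in with hypothetical fields, since the crate's actual `LString` definition is not part of this excerpt.

```
// Stand-in type; the real `LString` fields are not shown in this diff.
#[derive(Clone)]
struct LikeLString {
    string: String,
    language_tag: String,
}

fn main() {
    let original = LikeLString {
        string: "Hello".to_string(),
        language_tag: "en".to_string(),
    };
    // With `Clone` derived, an independent copy can now be made.
    let copy = original.clone();
    assert_eq!( original.string, copy.string );
}
```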
12 changes: 6 additions & 6 deletions crates/lexer/README.asciidoc
@@ -66,19 +66,19 @@ features = [ "serde" ]

```
use i18n_icu::IcuDataProvider;
- use i18n_lexer::{Token, TokenType, Lexer};
+ use i18n_lexer::{Token, TokenType, tokenise};
use icu_testdata::buffer;
use icu_provider::serde::AsDeserializingBufferProvider;
use std::rc::Rc;
use std::error::Error;

- fn tokenise() -> Result<(), Box<dyn Error>> {
+ fn test_tokenise() -> Result<(), Box<dyn Error>> {
    let buffer_provider = buffer();
    let data_provider = buffer_provider.as_deserializing();
    let icu_data_provider = IcuDataProvider::try_new( &data_provider )?;
-   let mut lexer = Lexer::try_new( &Rc::new( icu_data_provider ) )?;
-   let tokens = lexer.tokenise(
-       "String contains a {placeholder}.", &vec![ '{', '}' ]
+   let tokens = tokenise(
+       "String contains a {placeholder}.",
+       &vec![ '{', '}' ],
+       &Rc::new( icu_data_provider ),
    );
    let mut grammar = 0;
    assert_eq!( tokens.0.iter().count(), 10, "Supposed to be a total of 10 tokens." );
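For readability, here is the updated example reassembled from the added lines of the diff above; it reflects the `tokenise()` change described in the changelog. The original README example continues beyond this excerpt, so the trailing comment and `Ok(())` are a hedged completion rather than the README's exact text, and `Token`/`TokenType` are presumably used in the elided portion.

```
use i18n_icu::IcuDataProvider;
use i18n_lexer::{Token, TokenType, tokenise};
use icu_testdata::buffer;
use icu_provider::serde::AsDeserializingBufferProvider;
use std::rc::Rc;
use std::error::Error;

fn test_tokenise() -> Result<(), Box<dyn Error>> {
    let buffer_provider = buffer();
    let data_provider = buffer_provider.as_deserializing();
    let icu_data_provider = IcuDataProvider::try_new( &data_provider )?;

    // `tokenise()` is now a free function taking the ICU data provider directly,
    // replacing the former `Lexer::try_new()` / `lexer.tokenise()` pair.
    let tokens = tokenise(
        "String contains a {placeholder}.",
        &vec![ '{', '}' ],
        &Rc::new( icu_data_provider ),
    );

    let mut grammar = 0; // used by the remainder of the example, elided here
    assert_eq!( tokens.0.iter().count(), 10, "Supposed to be a total of 10 tokens." );

    // ... the rest of the example is not shown in this diff excerpt ...
    Ok(())
}
```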
40 changes: 0 additions & 40 deletions crates/lexer/src/error.rs

This file was deleted.
