# AsObjects

`AsObjects` converts a token stream of arrays representing individual rows into a token stream of objects, using the first row as a list of field names.

`AsObjects` is based on Transform. It operates in object mode, transforming a token stream produced by a parser or filters.
```js
const {chain} = require('stream-chain');
const {parser} = require('stream-csv-as-json/Parser');
const {asObjects} = require('stream-csv-as-json/AsObjects');
const fs = require('fs');
const zlib = require('zlib');

chain([
  fs.createReadStream('data.csv.gz'),
  zlib.createGunzip(),
  // data:
  // a,b,c
  // 1,2,3
  parser(),
  // ['a', 'b', 'c']
  // ['1', '2', '3']
  asObjects()
  // {a: '1', b: '2', c: '3'}
]);
```
`AsObjects` has no special API. It uses common stream options and understands the following custom flags (derived from Stringer and stream-json's Parser):
- Packing options control packing keys. They have no default values.
  - `packValues` serves as the initial value for packing keys. It is here for consistency with stream-json.
  - `packKeys` specifies if we need to pack keys and send them as a value.
  - More details in the section below.
- Streaming options control sending unpacked keys. They have no default values. They are here mostly for consistency with stream-json.
  - `streamValues` serves as the initial value for streaming keys. It is here for consistency with stream-json.
  - `streamKeys` specifies if we need to send items related to unpacked keys.
  - More details in the section below.
- Selecting key values:
  - `useValues` serves as the initial value for selecting packed or streamed values of strings. It is here for consistency with stream-json.
  - `useStringValues` specifies if we need to use packed strings or streamed ones.
These flags behave like in the corresponding classes they were borrowed from; a minimal usage sketch is shown below.
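The following sketch only illustrates how these flags could be passed to the factory; the option names are the ones listed above, while the chosen values and their effect on the token stream are assumptions for demonstration:

```js
const {asObjects} = require('stream-csv-as-json/AsObjects');

// Assumption for illustration: ask for packed key tokens and suppress
// streamed key chunks, mirroring the stream-json conventions these
// flags were borrowed from.
const converter = asObjects({packKeys: true, streamKeys: false});
```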
Additionally, it recognizes the following option:
- `fieldPrefix` is a string, which is used to generate missing field names. Default: `'field'`.
  - If a field name is empty or missing, e.g., 5 field names are specified in the first row, but some rows have more than 5 values, then an artificial field name is generated by concatenating the `fieldPrefix` value and a zero-based field index (see the sketch below).
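A minimal sketch of the generated names, assuming a hypothetical `ragged.csv` whose second row carries more values than the header declares; the expected output is derived from the zero-based index rule above:

```js
const {chain} = require('stream-chain');
const {parser} = require('stream-csv-as-json/Parser');
const {asObjects} = require('stream-csv-as-json/AsObjects');
const fs = require('fs');

chain([
  fs.createReadStream('ragged.csv'),
  // data:
  // a,b,c
  // 1,2,3,4,5
  parser(),
  asObjects({fieldPrefix: 'col'})
  // conceptually: {a: '1', b: '2', c: '3', col3: '4', col4: '5'}
  // (with the default prefix the extra names would be 'field3' and 'field4')
]);
```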
`make()` and `asObjects()` are two aliases of the factory function. It takes the options described above and returns a new instance of `AsObjects`. `asObjects()` helps to reduce boilerplate when creating data processing pipelines:
```js
const {chain} = require('stream-chain');
const {parser} = require('stream-csv-as-json/Parser');
const {asObjects} = require('stream-csv-as-json/AsObjects');
const fs = require('fs');

const pipeline = chain([
  fs.createReadStream('sample.csv'),
  parser(),
  asObjects()
]);

// every row becomes an object, so counting startObject tokens counts rows
let rowCounter = 0;
pipeline.on('data', data => data.name === 'startObject' && ++rowCounter);
pipeline.on('end', () => console.log(`Found ${rowCounter} rows.`));
```
The `Constructor` property of `make()` (and `asObjects()`) is set to `AsObjects`. It can be used for indirect creation of filters or for metaprogramming, if needed.
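A minimal sketch of that hook, assuming the property is spelled `Constructor` as stated above:

```js
const {asObjects} = require('stream-csv-as-json/AsObjects');

// Indirect creation: obtain the class from the factory and instantiate it.
const AsObjectsClass = asObjects.Constructor;
const converter = new AsObjectsClass({fieldPrefix: 'col'});
```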
`withParser()` takes one argument:

- `options` is an object described in Parser's options. It is used to initialize both streams (a `Parser` instance and a stream returned by `make()`).

It returns a stream produced by stream-chain, which wraps the pipeline. The most important utility of `withParser()` is that it correctly sets object modes of the returned stream: object mode for the Readable part and text mode for the Writable part.
This static method is created using the `withParser()` utility. It simplifies the case when a stream should be immediately preceded by a parser.
```js
const AsObjects = require('stream-csv-as-json/AsObjects');
const fs = require('fs');

const pipeline = fs.createReadStream('sample.csv')
  .pipe(AsObjects.withParser());

let rowCounter = 0;
pipeline.on('data', data => data.name === 'startObject' && ++rowCounter);
pipeline.on('end', () => console.log(`Found ${rowCounter} rows.`));
```