The codebase is written in ES6 JavaScript but is compiled to pure JavaScript in the published npm module.

To get started, simply install the module using npm:

```bash
$ npm install node-dbf
```

and then `import` it:

```js
import Parser from 'node-dbf';
```

# Classes

There are two classes - the `Parser` and the `Header`. The `Parser` is the most interesting class.

## Parser

This class is the main interface for reading data from dBase files. It extends `EventEmitter` and its output is via events.

### new Parser(path, options)

* path `String` The full path to the DBF file to parse
* options `Object` An object containing options for the parser.
Creates a new Parser and attaches it to the specified filename.

    import Parser from 'node-dbf';

    let parser = new Parser('/path/to/my/dbase/file.dbf');

### parser.on(event, listener)

* event `String` The event name to listen for (see below for details)
* listener `Function` The callback to bind to the event
This method is inherited from the `EventEmitter` class.
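
For example, a listener for the `header` event documented below can be bound like this (a minimal sketch using the `parser` created above):

```js
// Bind a listener for the 'header' event documented below
parser.on('header', (header) => {
  console.log('Header parsed');
});
```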

### parser.parse()

Call this method once you have bound to the events you are interested in. Although it returns the parser object (for chaining), all of the dBase data is output via events.
    parser.parse();
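
Since `parse()` returns the parser object, calls can also be chained. A minimal sketch, assuming parsing proceeds asynchronously so a listener attached to the returned object is registered before its event fires:

```js
// Assumes parsing is asynchronous, so the 'end' listener attached to the
// returned parser is registered before the event is emitted
parser.parse().on('end', () => {
  console.log('All records have been read');
});
```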

### Event: 'start'

* parser `Parser` The parser object
This event is emitted as soon as the `parser.parse()` method has been invoked.

### Event: 'header'

* header `Header` The header object as parsed from the dBase file
This event is emitted once the header has been parsed from the dBase file.

### Event: 'record'

* record `Object` An object representing the record that has been found

In addition to the fields, the object contains two special keys:

* @sequenceNumber `Number` The sequence number of the record within the dBase file
* @deleted `Boolean` Whether this record has been deleted or not

This object may look like:

```json
{
  "@sequenceNumber": 123,
  "@deleted": false,
  "firstName": "John",
  "lastName": "Smith"
}
```
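
For instance, a `record` handler might skip rows flagged as deleted, using only the keys shown above (the field names `firstName` and `lastName` are just illustrative):

```js
parser.on('record', (record) => {
  if (record['@deleted']) return; // ignore records marked as deleted
  console.log(`${record.firstName} ${record.lastName}`);
});
```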

### Event: 'end'

* parser `Parser` The parser object
This event is fired once the dBase parsing is complete and there are no more records remaining.

## Usage

The following code example illustrates a very simple usage for this module:

```js
import Parser from 'node-dbf';

let parser = new Parser('/path/to/my/dbase/file.dbf');

parser.on('start', (p) => {
  console.log('dBase file parsing has started');
});

parser.on('header', (h) => {
  console.log('dBase file header has been parsed');
});

parser.on('record', (record) => {
  console.log('Name: ' + record.firstName + ' ' + record.lastName); // Name: John Smith
});

parser.on('end', (p) => {
  console.log('Finished parsing the dBase file');
});

parser.parse();
```

# Command-Line Interface (CLI)

The parser also supports a command-line interface (CLI) for converting DBF files to CSV. You can invoke it as follows:

```bash
$ node-dbf convert /path/to/file.dbf
```

This will write the converted rows to `stdout` and metadata about the process (e.g. the number of rows) to `stderr`. This allows you to write `stdout` directly to an output file, for example:

```bash
$ node-dbf convert file.dbf > file.csv
```
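
Because the metadata goes to `stderr`, it can also be captured in its own file (or discarded) with an ordinary shell redirect; the log filename here is just illustrative:

```bash
$ node-dbf convert file.dbf > file.csv 2> convert.log
```
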
For more information on the command-line options, use the integrated help:

```bash
$ node-dbf help
```

# Tests

Tests are written in Mocha, using Chai's BDD-style expectations. Data on San Francisco zip codes, downloaded from [SF OpenData](https://data.sfgov.org/), is used as the reference test file and is included in the repository at `./test/fixtures/bayarea_zipcodes.dbf`.
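
Assuming the repository wires Mocha up to the standard npm `test` script (not verified here), the suite would typically be run with:

```bash
$ npm install
$ npm test
```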

# To Do

* Add more tests
* Add support for field types other than Character and Numeric