kaue / jsonexport
{} → :page_facing_up: it's easy to convert JSON to CSV
Home Page: http://kaue.github.io/jsonexport/
License: Apache License 2.0
Hi there
Really like how easy your library is to use :-)
I have a JSON array of objects, where the objects may or may not have all of the fields in the header.
What I am seeing in the output is that missing fields in the "middle" of the header correctly produce ",,,", but trailing commas don't seem to be added for fields missing at the end of a row.
So I have some rows that have all 20 columns, but others that only have commas for 15 because the missing fields are towards the end of the header.
Is that the expected behaviour?
Many thanks in advance
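If the library doesn't pad trailing columns, a small post-processing step can. This is a hypothetical workaround sketch, not part of jsonexport; the naive split assumes no delimiter characters appear inside quoted fields.

```javascript
// Hypothetical workaround: pad every row to the header's column count so
// trailing missing fields still get their commas. Naive split: it assumes
// no delimiter characters appear inside quoted fields.
const padRows = (csv, delimiter = ',') => {
  const lines = csv.split('\n');
  const width = lines[0].split(delimiter).length; // header row sets the width
  return lines
    .map(line => {
      const missing = width - line.split(delimiter).length;
      return missing > 0 ? line + delimiter.repeat(missing) : line;
    })
    .join('\n');
};

console.log(padRows('a,b,c\n1,2')); // the short row "1,2" becomes "1,2,"
```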
bitHound discovered 2 LintIssues in tests/options.js. For details go to bitHound.
This commit (26513e7) breaks calls that omit user options (moving options from module to function scope throws a ReferenceError).
I installed your module, and for the most part it is working great. I am pulling data from a database and then feeding the result (in a "simple array" format) into your module to get a "csv" output. However, when I went back and checked the resulting output, I noticed that my data did not match the order from the database.
More specifically, while all "NULL" values were converted to empty strings (it appears, anyway), the cells that contained "NULL" values were being eliminated from the proper order and being shifted to the end of that column's "csv" results.
Data that is ordered like this in the database:
id name status dateAdded count
1 Bob NULL 1-1-17 NULL
2 Martha NULL 5-21-17 3
3 Paul fired 10-3-17 18
Comes out like this in the csv results:
id,name,status,dateAdded, count
1,Bob,fired,1-1-17 ,3
2,Martha,,5-21-17,18
3,Paul,,10-3-17,
I have no options set, and am essentially running the function like this:
jsonexport(myjsondata, (err, csv) => {
if(err) return console.log(err)
return csv
} )
Can anyone help me figure out why this is happening? It's throwing my row data off and making the results unusable to send to clients or other co-workers. Thank you!
bitHound discovered 12 LintIssues, and 2 ComplexFunctions in lib/index.js. For details go to bitHound.
Currently we are skipping all handler tests; we should update these tests to match the new API: https://github.com/kauegimenes/jsonexport/blob/master/tests/options.js#L185
We should also consider adding more examples on how to use typeHandlers, see #61
In the complex JSON example I see the CSV output below, which doesn't look like CSV format. And how can this be converted back to JSON? I tried different CSV-to-JSON modules but none of them work.
Actually, I like this format; it looks easier for a language writer to open in Excel and see all keys in one column and the values in the next, but it can't be converted back to JSON.
cars,12
roads,5
traffic,slow
speed.max,123
speed.avg,20
speed.min,5
size,10;20
Here's my code
var test = [
{
"name" : "John",
"clases" : [
{
"year" : 2016,
"grade" : "1st",
"payment":[
{
"type":"monthly",
"status":"done"
},
{
"type":"yearly",
"status":"done"
}
]
},
{
"year" : 2017,
"grade" : "2nd",
"payment":[
{
"type":"monthly",
"status":"in progress"
},
{
"type":"yearly",
"status":"done"
}
]
}
],
},
{
"name" : "Andrew",
"clases" : [
{
"year" : 2017,
"grade" : "1nd",
"payment":[
{
"type":"monthly",
"status":"in progress"
},
{
"type":"yearly",
"status":"in progress"
}
]
}
]
},
]
jsonexport(test,function(err, csv){
if(err) return console.log(err);
console.log(csv);
});
and here is the output
name,clases.year,clases.grade,clases.payment.type,clases.payment.status
John,2016,1st,monthly,done
,2017,2nd,yearly,done
,,,monthly,in progress
,,,yearly,done
Andrew,2017,1nd,monthly,in progress
,,,yearly,in progress
As you can see, the values are not where they are supposed to be. Each should drop to the next line.
And is there any option to force repeating values so it would look like this?
Thank you.
Hi there,
My array does not have keys. Not sure if that is the reason why it is not being converted. It generates an empty CSV file. All other more plain vanilla JSON files worked but not this one. Any clues why? Here is my sample:
[
[ 'funding', 'USD', 5.86362147, 0, 5.86362147, null, null ],
[ 'funding', 'BTC', 0.0050514, 0, 0.0050514, null, null ],
[ 'funding', 'BTG', 0.0050514, 0, 0.0050514, null, null ],
[ 'funding', 'BAB', 0.0199, 0, 0.0199, null, null ],
[ 'funding', 'BSV', 0.0099, 0, 0.0099, null, null ],
[
'exchange',
'USD',
114.70389722245558,
0,
114.70389722245558,
'Trading fees for 0.00419623 BTC (BTCUSD) @ 10274.2628 on BFX (0.2%)',
null
],
This is a follow-up to: #64
Thanks so much kaue! This works now for me, for all files I checked.
I did have a follow-up on speed: things slow down for files 1.5 MB or higher. For me (Win 10 Pro, Surface Pro, 16 GB RAM, Intel i7-7660U), a 3.9 MB file took 42 minutes (command: jsonexport <json_filename> <output_filename>). Is this sort of completion time expected?
I'll provide some background real quick. The original JSON object has a bunch of orders in a JSON array. I loop through this array, as shown in the beginning of the first image, and use the p21ify() Promise to format the data for each individual order. Once it's formatted, I would like to save that data to a .json and a .csv file.
I have an interesting problem, though. If I use the jsonexport() function outside of the p21ify Promise shown in the attached image, it works fine. If I use it inside of this Promise, however, it gives me a call stack size error.
I've also tried JSON.stringify(this) in the jsonexport() call in the other image, but it told me that it "couldn't parse" the object. I'm pretty sure it was working earlier with a regular JS object.
Any ideas? I have a feeling that I'm going into too many layers but I'm not exactly sure how else to do it.
Would it be possible to set a specific order of columns?
Strings like smth " string are not enclosed in quotes, so any other CSV-to-JSON parser can't read them properly. In other words, if a field of a JSON object contains a " (double quote), it will not be parsed correctly.
This problem is probably related to the #10 topic.
Having an issue with getting the JSON data to delimit rows. I've tried reading through the documentation for jsonexport and had no luck.
Here is what the JSON data looks like when returned from an API:
{
"data": [
{
"created_time": "2017-09-22T22:32:48+0000",
"from": {
"name": "CENSORED",
"id": "529437227448196"
},
"message": "When you care to have the best service and a quality car visit Lexus of Albuquerque they are great. N H",
"id": "10155842414888534_10155843734438534"
},
{
"created_time": "2017-09-22T18:39:17+0000",
"from": {
"name": "CENSORED",
"id": "134712363931856"
},
"message": "My grandson just upgraded to a Lexus and he says it is the best decision he ever made. He is excited that he doesn't have to worry about oil changes anymore.",
"id": "10155842414888534_10155843129293534"
}
],
"paging": {
"cursors": {
"before": "NjQZD",
"after": "NjMZD"
},
"next": "CENSORED"
}
}
As a result of all the data being inside of "data", it seems to all be stuck on a single row when exported into CSV format.
Here is the CSV output:
https://s26.postimg.org/ywxrqvz3t/Screen_Shot_2017-09-25_at_10.41.08_AM.png
Here is my code:
var options = {
mainPathItem: "data",
verticalOutput: false,
rowDelimiter: ','
};
request({
url: url,
json: true
}, function (error, response, body) {
if (!error && response.statusCode === 200) {
//console.log(body) // Print the json response
var json_response = body;
jsonexport(json_response, options,function(err, csv){
if(err) return console.log(err);
console.log(csv);
fs.writeFile(`exports/${postID}.csv`, csv, function(err) {
if (err) throw err;
console.log('File successfully exported!');
});
});
}
});
Attached is a csv file generated using jsonexport to convert Twitter data (100 tweets) from the Streaming API (in JSON) to the given csv file.
twitStream.on('tweet', function(tweet) {
counter++;
tweets.push(tweet);
if(counter == count) {
twitStream.stop();
var parsedTweets = [];
for(aTweet in tweets) {
var parsed = {
created_at: tweets[aTweet].created_at,
id: tweets[aTweet].id,
text: tweets[aTweet].text,
user_id: tweets[aTweet].user.id,
user_name: tweets[aTweet].user.name,
user_screen_name: tweets[aTweet].user.screen_name,
user_location: tweets[aTweet].user.location,
user_followers_count: tweets[aTweet].user.followers_count,
user_friends_count: tweets[aTweet].user.friends_count,
user_created_at: tweets[aTweet].user.created_at,
user_time_zone: tweets[aTweet].user.time_zone,
user_profile_background_color: tweets[aTweet].user.profile_background_color,
user_profile_image_url: tweets[aTweet].user.profile_image_url,
geo: tweets[aTweet].geo,
coordinates: tweets[aTweet].coordinates,
place: tweets[aTweet].place
}
parsedTweets.push(parsed);
}
//https://www.npmjs.com/package/jsonexport
jsonexport(parsedTweets, function(err, csv) {
if(err) return console.log(err);
fs.writeFile("wangm13-tweets.csv", csv);
});
res.send(tweets);
}
});
In the csv file, some of the text fields with newlines are correctly surrounded with quotes (for example on line 46: "RT @LToddWood: Judge Napolitano Returns On Fox News, Stands By Claim Brits Spied On Trump
https://t.co/Bumb2HG8eI"). However, some are not.
Examples:
Line 6
RT @TLBJames: This is EXACTLY the bigoted-GOP-emboldening I knew Trump's coup-by-EC-victory would cause.
And EXACTLY why we're g…
Line 35
Please stop this.
cc @SenMarkey @SenWarren @RepKClark https://t.co/LStc6p6TmW
/api/node_modules/jsonexport/lib/index.js:197
function checkType(element, item){
^
RangeError: Maximum call stack size exceeded
Hi there - Great library! I would love if I could include this in my projects that use ES6 modules.
Hey folks, I am using jsonexport to convert to csv. My timestamp field which comes from a database query is being truncated when I convert to csv. I expect it to stay the same.
What I expect? ( and see when this is printed in json)
"last_modified_date":"2018-10-26T19:17:01.447Z"
What I get?
"last_modified_date":"2018-10-26"
My code is very simple.
jsonexport(output.rows,{rowDelimiter: '|', includeHeaders: false}
I saw that the handleDate type handler is deprecated. Can someone share some example code of how I can go about solving this?
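Until the supported date handler is settled, one version-independent workaround is to stringify dates before they reach jsonexport. This is a sketch with a hypothetical helper, isoDates, not a jsonexport API:

```javascript
// Hypothetical helper: convert Date values to full ISO strings before the
// rows reach jsonexport, so no handleDate/typeHandlers option is involved
// and the timestamp is not truncated to a date.
const isoDates = rows =>
  rows.map(row =>
    Object.fromEntries(
      Object.entries(row).map(([key, value]) =>
        [key, value instanceof Date ? value.toISOString() : value])));

const rows = [{ id: 1, last_modified_date: new Date(Date.UTC(2018, 9, 26, 19, 17, 1, 447)) }];
console.log(isoDates(rows)[0].last_modified_date); // 2018-10-26T19:17:01.447Z
```

You would then pass isoDates(output.rows) to jsonexport instead of output.rows.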
This module needs to implement streaming to support huge CSVs; currently it builds the whole CSV output in memory: https://github.com/kauegimenes/jsonexport/blob/master/lib/index.js#L109
Hi,
I use similar kind of multi-array set. I am using the below input:
var test=[
{
"CODE": "AA",
"GROUP_NAME": "aaaaaa",
"GROUP_DESC": "aaaaaaaaa",
"CHILD_GROUP": [
{
"CODE": "AA",
"GROUP_NAME": "aaaaaa",
"CHILD_CODE": "bb"
},
{
"CODE": "AA",
"GROUP_NAME": "aaaaaa",
"CHILD_CODE": "cc"
},
{
"CODE": "AA",
"GROUP_NAME": "aaaaaa",
"CHILD_CODE": "dd"
}
]
}
]
jsonexport(test,function(err, csv){
if(err) return console.log(err);
console.log(csv);
});
And my output is :
CODE,GROUP_NAME,GROUP_DESC,CHILD_GROUP.CODE,CHILD_GROUP.GROUP_NAME,CHILD_GROUP.CHILD_CODE
AA,aaaaaa,aaaaaaaaa,AA,aaaaaa,bb
,,,AA,aaaaaa,cc
,,,AA,aaaaaa,dd
The first child is getting appended to the parent. Could you please provide a solution for this?
I need the output like this:
AA,aaaaaa,aaaaaaaaa
,,,AA,aaaaaa,bb
,,,AA,aaaaaa,cc
,,,AA,aaaaaa,dd
This package uses ES6 syntax, which suits most people, but I'm having to jump through hoops.
This file is causing me pain, with its arrow functions: https://github.com/kauegimenes/jsonexport/blob/master/lib/core/helper.js
When using TypeScript, the node_modules folder is excluded from transpilation. For TypeScript, files in node_modules are expected to be in distribution format, and no transpiling takes place.
I use webpack but not Babel, and do not wish to install Babel just to transpile this package.
I'm going to make a pull request to transpile this library back down. I hope you will accept it. I will use the fork till then.
Thank you kindly
The vertical output option isn't working.
jsonexport(
// example data
[
{
name: "Paul",
xp: 26
},
{
name: "Joshi",
xp: 87
},
{
name: "Jakob",
xp: 120
}
],
{ rowDelimiter: ";", verticalOutput: true },
(error, csv) => {
if (error) return console.error(error);
console.log(csv);
}
);
The output is the same with both true and false:
name;xp
Paul;26
Joshi;87
Jakob;120
But should be:
name;Paul;Joshi;Jakob
xp;26;87;120
Operating system: Windows 10
If source json string fields contain special characters (ñ,ó,à,ô,è) and exported csv file is opened in Excel, the characters are replaced with à ¨ © ± Ã
This seems to be originating from the encoding. If I change the encoding to UTF-8-BOM with a text editor, then excel shows those characters correctly.
It would be nice to have a Promise instead of a callback.
Exporting an object with a Date throws an error (TypeError: this._options.handleDate is not a function) when it should be exported in a local date format, as the default handleDate function is implemented:
jsonexport/lib/parser/handler.js
Lines 181 to 189 in f486a71
import * as jsonexport from 'jsonexport';
const obj: object = {
firstname: "Guillaume",
lastname: "Ongenae",
height: 187,
birth: new Date("01/16/1994"),
};
jsonexport(obj)
.then(console.log)
.catch(console.error);
❯ ts-node index.ts
TypeError: this._options.handleDate is not a function
at Handler.checkComplex (/Users/guillaume/Documents/try/csv/node_modules/jsonexport/lib/parser/handler.js:46:30)
at Handler.check (/Users/guillaume/Documents/try/csv/node_modules/jsonexport/lib/parser/handler.js:96:17)
at Parser._parseObject (/Users/guillaume/Documents/try/csv/node_modules/jsonexport/lib/parser/csv.js:159:35)
at Parser.parse (/Users/guillaume/Documents/try/csv/node_modules/jsonexport/lib/parser/csv.js:34:60)
at /Users/guillaume/Documents/try/csv/node_modules/jsonexport/lib/index.js:76:12
at new Promise (<anonymous>)
at module.exports (/Users/guillaume/Documents/try/csv/node_modules/jsonexport/lib/index.js:75:10)
at Object.<anonymous> (/Users/guillaume/Documents/try/csv/index.ts:10:1)
at Module._compile (internal/modules/cjs/loader.js:1063:30)
at Module.m._compile (/usr/local/lib/node_modules/ts-node/src/index.ts:1043:23)
Just for reference
❯ ts-node index.ts
firstname,Guillaume
lastname,Ongenae
height,187
When I stream multiple JSONs through jsonexport, it seems that headers are only generated for fields present in the first JSON, so fields that only appear in subsequent JSONs end up without a header.
Is it possible to separate arrays in complex arrays into different records?
ie.
var contacts = [{
name: 'Robert',
lastname: 'Miller',
family: null,
location: [1231,3214,4214]
}];
name,lastname,family.name,family.type,family,location,nickname
Robert,Miller,,,,1231
Robert,Miller,,,,3214
Robert,Miller,,,,4214
Hello,
I am using jsonexport library to convert a complex JSON object into csv, and running into some unexpected behavior. Here's my code-
var jsonexport = require('jsonexport');
var contacts = [
{
"id": "101",
"resourceType": "Employee",
"meta": {
"profile": ["1q2w3e4r"],
"tag": [
{
"system": "salary",
"code": "2w1e3r34t"
},
{
"system": "pto",
"code": "33errfee"
},
{
"system": "benefits",
"code": "jiojofirf494r93"
}
]
},
"category": [
{
"coding": [
{
"system": "orgchart",
"org": "sales",
"display": "manager"
}
]
}
],
"benefits": {
"coding": [
{
"system": "ePTO",
"code": "387ry329r82y3",
"display": "6 weeks paid vacation"
},
{
"system": "insurance",
"code": "754689",
"display": "Comprehensive"
}
]
}
},
{
"id": "102",
"resourceType": "Employee",
"meta": {
"profile": ["e3r3434r"],
"tag": [
{
"system": "salary",
"code": "t65y5y"
},
{
"system": "pto",
"code": "er3r32r23r2"
},
{
"system": "benefits",
"code": "fwf3232"
}
]
},
"category": [
{
"coding": [
{
"system": "orgchart",
"org": "sales",
"display": "engineer"
}
]
}
],
"benefits": {
"coding": [
{
"system": "ePTO",
"code": "2w3e323",
"display": "2 weeks paid vacation"
},
{
"system": "insurance",
"code": "t43t43",
"display": "Bare minimum"
}
]
}
}
];
jsonexport(contacts, {
forceTextDelimiter: true,
fillGaps: true,
fillTopRow: true
},function(err, csv){
if(err) return console.log(err);
console.log(csv);
});
The actual output is -
"id","resourceType","meta.profile","meta.tag.system","meta.tag.code","category.coding.system","category.coding.org","category.coding.display","benefits.coding.system","benefits.coding.code","benefits.coding.display"
"101","Employee","1q2w3e4r","salary","2w1e3r34t","orgchart","sales","manager","ePTO","387ry329r82y3","6 weeks paid vacation"
"101","Employee","1q2w3e4r","pto","33errfee",,,,"insurance","754689","Comprehensive"
"101","Employee","1q2w3e4r","benefits","jiojofirf494r93",,,,"insurance","754689","Comprehensive"
"102","Employee","e3r3434r","salary","t65y5y","orgchart","sales","engineer","ePTO","2w3e323","2 weeks paid vacation"
"102","Employee","e3r3434r","pto","er3r32r23r2","orgchart","sales","engineer","insurance","t43t43","Bare minimum"
"102","Employee","e3r3434r","benefits","fwf3232","orgchart","sales","engineer","insurance","t43t43","Bare minimum"
(Pasting tabular format for visibility):
id | resourceType | meta.profile | meta.tag.system | meta.tag.code | category.coding.system | category.coding.org | category.coding.display | benefits.coding.system | benefits.coding.code | benefits.coding.display |
---|---|---|---|---|---|---|---|---|---|---|
101 | Employee | 1q2w3e4r | salary | 2w1e3r34t | orgchart | sales | manager | ePTO | 387ry329r82y3 | 6 weeks paid vacation |
101 | Employee | 1q2w3e4r | pto | 33errfee | insurance | 754689 | Comprehensive | |||
101 | Employee | 1q2w3e4r | benefits | jiojofirf494r93 | insurance | 754689 | Comprehensive | |||
102 | Employee | e3r3434r | salary | t65y5y | orgchart | sales | engineer | ePTO | 2w3e323 | 2 weeks paid vacation |
102 | Employee | e3r3434r | pto | er3r32r23r2 | orgchart | sales | engineer | insurance | t43t43 | Bare minimum |
102 | Employee | e3r3434r | benefits | fwf3232 | orgchart | sales | engineer | insurance | t43t43 | Bare minimum |
jsonexport fails to fill the 2nd and 3rd row for category.coding fields for the first record (see empty cells in rows 2 and 3), but does it for the second. Interestingly, it fills the last object of benefits.coding on the third row for each record. Do you know why jsonexport treats first record fields differently than the second? The JSON structure is identical for both records.
Abhi
I think i found a bug. When i run the following code:
var jsonexport = require('jsonexport');
var invited =
[
{
day : "1",
time : "12:00",
persons :
[
{name: 'Bob',
lastname: 'Smith'},
{name: 'James',
lastname: 'piet'},
{name: "",
lastname: 'Miller'},
{name: 'David',
lastname: 'Martin'}
]
}
]
const options = {fillGaps: true};
jsonexport(invited, options, function (err, csv) {
if (err) return console.log(err);
console.log(csv);
});
The output is:
day,time,persons.name,persons.lastname
1,12:00,Bob,Smith
1,12:00,James,piet
1,12:00,David,Miller
1,12:00,David,Martin
As you can see, the third object has an empty string "" for name, but in the output the name "David" appears instead. What I expect, of course, is an empty string.
I hope the explanation is clear.
Thank you in advance,
Piet Kamps
I have a series of JSON files which are about 100 KB (or greater) in size. But when I try to convert, I get an empty file. Smaller files convert fine.
This is from within a Windows 10 environment.
This issue/request will most likely be handled by me. I want checkboxes and/or inputs that control the output of the web-based jsonexport interface, such as textDelimiter.
When compiling code for updated versions of TypeScript, it now checks the require statements and several errors are being thrown:
ERROR in ../node_modules/jsonexport/dist/core/join-rows.js
Module not found: Error: Can't resolve 'os'
ERROR in ../node_modules/jsonexport/dist/parser/csv.js
Module not found: Error: Can't resolve 'os'
ERROR in ../node_modules/jsonexport/dist/core/stream.js
Module not found: Error: Can't resolve 'stream'
ERROR in ../node_modules/jsonexport/dist/parser/csv.js
Module not found: Error: Can't resolve 'stream'
It seems require('stream') and require('os') are not really needed for the browser. The EOL can just be "\n" in the browser, and stream is not needed if a callback is provided.
I manually edited the node_modules/jsonexport files and all came out well and built for the browser. I will need to find the most compatible way to have all files live in harmony so that the dist folder has NO Node-based requires.
I will have to complete this within a week's time at most. Please be available to publish to npm when I have completed it.
I'm trying to export objects that contain an id field, whose value is a node Buffer instead of a String. Is there a way I can easily cast the Buffer to a String? Otherwise I get a header for each byte in the buffer.
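One way to avoid the per-byte headers is to stringify Buffer values before export. A sketch with a hypothetical helper, stringifyBuffers, not a jsonexport API:

```javascript
// Hypothetical helper: cast any Buffer values (e.g. a binary id column)
// to hex strings, so each row yields a single id cell instead of one
// column per byte.
const stringifyBuffers = rows =>
  rows.map(row =>
    Object.fromEntries(
      Object.entries(row).map(([key, value]) =>
        [key, Buffer.isBuffer(value) ? value.toString('hex') : value])));

const rows = [{ id: Buffer.from([0xde, 0xad]), name: 'widget' }];
console.log(stringifyBuffers(rows)); // [ { id: 'dead', name: 'widget' } ]
```

A base64 or utf8 encoding works the same way if that matches how the ids were produced.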
// JSON
[ { id: 84192,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '(none)',
name: 'name',
alias: 'alias',
part_number: 'PD-LPM-0102EP',
description: null,
status: 'Obsolete',
published: 1,
pb_free: 0,
rohs: 0,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 },
{ id: 84193,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '(none)',
name: 'name',
alias: 'alias',
part_number: 'PD-NPM-0305EP',
description: null,
status: 'Obsolete',
published: 1,
pb_free: 0,
rohs: 0,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 },
{ id: 84194,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '(none)',
name: 'name',
alias: 'alias',
part_number: 'PD-NPM-0306EP',
description: null,
status: 'Obsolete',
published: 1,
pb_free: 0,
rohs: 0,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 },
{ id: 84944,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '(none)',
name: 'name',
alias: 'alias',
part_number: 'PD70101ILQ',
description: null,
status: 'In Production',
published: 1,
pb_free: 1,
rohs: 1,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 },
{ id: 84945,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '(none)',
name: 'name',
alias: 'alias',
part_number: 'PD70201ILQ',
description: null,
status: 'In Production',
published: 1,
pb_free: 1,
rohs: 1,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 },
{ id: 84195,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '(none)',
name: 'name',
alias: 'alias',
part_number: 'PD-NPM-0307EP',
description: null,
status: 'Obsolete',
published: 1,
pb_free: 0,
rohs: 0,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 },
{ id: 84196,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '(none)',
name: 'name',
alias: 'alias',
part_number: 'PD-NPM-0308EP',
description: null,
status: 'Obsolete',
published: 1,
pb_free: 0,
rohs: 0,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 },
{ id: 136582,
parent_id: 3551,
product_parent_id: null,
regular: 1,
parent: 0,
child: 0,
configuration: '0',
name: 'name',
alias: 'alias',
part_number: 'PD81001',
description: null,
status: 'In Production',
published: 1,
pb_free: 1,
rohs: 1,
hirel: 0,
is_new: 0,
registerable: 1,
navigation: 1,
search: 1 } ]
//CSV
search,navigation,registerable,is_new,hirel,rohs,pb_free,published,status,part_number,alias,name,configuration,child,parent,regular,parent_id,id
1,1,1,0,0,0,0,1,Obsolete,PD-LPM-0102EP,alias,name,(none),0,0,1,3551,84192
1,1,1,0,0,0,0,1,Obsolete,PD-NPM-0305EP,alias,name,(none),0,0,1,3551,84193
1,1,1,0,0,0,0,1,Obsolete,PD-NPM-0306EP,alias,name,(none),0,0,1,3551,84194
1,1,1,0,0,1,1,1,In Production,PD70101ILQ,alias,name,(none),0,0,1,3551,84944
1,1,1,0,0,1,1,1,In Production,PD70201ILQ,alias,name,(none),0,0,1,3551,84945
1,1,1,0,0,0,0,1,Obsolete,PD-NPM-0307EP,alias,name,(none),0,0,1,3551,84195
1,1,1,0,0,0,0,1,Obsolete,PD-NPM-0308EP,alias,name,(none),0,0,1,3551,84196
1,1,1,0,0,1,1,1,In Production,PD81001,alias,name,0,0,0,1,3551,136582
Say I currently have JSON like this:
{
"firstName": "John",
"lastName": "Joe"
}
While exporting, can I export only "firstName"? Is that possible?
I'm working on a web app and need to convert JSON to CSV, so I grabbed this library and threw it into Webpack. However for some reason if I use it on an array the first column is 'contains' and all subsequent cells in that row are blank.
The issue is reproducible by going here: http://chojax.azurewebsites.net/
and entering 'jsonExport([], console.log)' into your console.
I believe it to be something related to the application however I'm not sure what could be causing this. Using Webpack to export the library into a blank html page does not cause the issue to occur.
Any assistance is appreciated!
var jsonexport = require('jsonexport');
var contacts = [{
name: 'Bob',
lastname: 'Smith',
address: {
number: 1
}
},{
name: 'James',
lastname: 'David'
},{
name: 'Robert',
lastname: 'Miller'
},{
name: 'David',
lastname: 'Martin'
}];
jsonexport(contacts, {
headers: ['lastname', 'name', 'address.number'],
rename: ['Last Name', 'Name', 'Address Number']
},function(err, csv){
if(err) return console.log(err);
console.log(csv);
});
Hi,
I have a json to be converted to csv which seems to be doing fine with jsonexport.
In addition, I have used the headers and rename options to specify the header and equivalent names and its working well.
The JSON object has a number of fields and I am interested in a subset of them. Is this possible? For example, if the column headers are field1, field2, field3, etc., and I need only field1 and field2, how would I go about selecting them?
Thanks
Vasu
Hi,
I'm trying to convert a large json array to a CSV.
My json array is provided by a mongodb query.
My code :
const stream = mong.getRandomDataWithLimit(10, count);
const output = fs.createWriteStream("./data/export/" + name + ".csv");
stream.pipe(jsonArrayStreams.stringify()).pipe(output);
My CSV doesn't have any header and starts at line 3.
What is wrong, please?
Noticed that on upgrading from 2.4.1 to 2.5.2 (and also 3.0.0) the library has started to convert columns with integer values of 0 into "".
From a quick look through the code I think this happened due to
const fillAndPush = (row) => rows.push(row.map(col => col || ''));
https://github.com/kaue/jsonexport/blob/master/lib/parser/csv.js#L89
committed in
I get this using webpack 3.11
ERROR in static/js/vendor.dcc406ea8a86a42719e0.js from UglifyJs
Unexpected token: name (endOfLine) [./node_modules/jsonexport/lib/core/escape-delimiters.js:13,0][static/js/vendor.dcc406ea8a86a42719e0.js:641,6]
When I install jsonexport via npm (even with --production), there is always a big JSON file (ranging between 300 and 500 MB) which seems to exist only for the test code. That's just ridiculous and makes deployment on e.g. AWS Lambda harder than necessary.
File: project_path/node_modules/jsonexport/data.json
When passing an array of objects that contain non-string values, a fatal error is thrown on version 1.5.0.
[ { "id": 1, "status": "processing"}, {"id": 2, "status":"complete"} ];
fails with:
.../node_modules/jsonexport/lib/escape-delimiters.js:29
var newValue = value.replace(textDelimiterRegex, escapedDelimiter);
TypeError: value.replace is not a function
I can use it as:
import * as jsonexport from 'jsonexport';
const csvData = await new Promise((resolve, reject) => jsonexport(data, (err, csv) => {
if (err) {
reject(err);
} else {
resolve(csv);
}
}));
Any better suggestions?
I have an id field, and many of the ids lead with one or more zeros. When I import the CSV to Excel, these get cut off. Normally I would wrap the field in double quotes or use a leading single quote to indicate it is a text type, but your nifty exporter handles these special characters differently. How would I make it so my field retains the leading zeros?
for example I have this kind of json
[ {
"name" : "reports-observation",
"version" : "1.0.0",
"description" : "",
"main" : "server.js",
"scripts" : {
"test" : "echo \"Error: no test specified\" && exit 1",
"start" : "node server.js"
},
"author" : "",
"license" : "ISC",
"dependencies" : {
"csv" : "^0.4.2",
"csv-generate" : "0.0.4",
"exceljs" : "^0.1.11",
"express" : "^4.12.3",
"fast-csv" : "^0.6.0",
"hapi" : "^8.4.0",
"should" : "^6.0.1"
}
}, {
"name" : "reports-observation",
"version" : "1.0.0",
"description" : "",
"main" : "server.js",
"scripts" : {
"test" : "echo \"Error: no test specified\" && exit 1",
"start" : "node server.js"
},
"author" : "",
"license" : "ISC",
"dependencies" : {
"csv" : "^0.4.2",
"csv-generate" : "0.0.4",
"exceljs" : "^0.1.11",
"express" : "^4.12.3",
"fast-csv" : "^0.6.0",
"hapi" : "^8.4.0",
"should" : "^6.0.1"
}
} ]
the first field displayed is the "dependencies" field instead of the "name" field.
P.S. If I want this to create an xls/xlsx file, how would I do that?
This is the JSON:
[{
"orderId": "111-8763976-4506632",
"version3.0": {
"customizationInfo": {
"surfaces": [
{
"name": "Angled Above Handle",
"areas": [
{
"name": "Choose Knife",
"optionImage": ""
}
]
},
{
"name": "Right Face",
"areas": [
{
"name": "Right Face Engraving",
"optionImage": "https://m.media-amazon.com/images/S/pc-vendor-gallery-prod/A17JMGFJQ7S29T/options/2018-08-24/ef64d10d-7668-42f1-a0ea-6c9971c13a0f.png"
}
]
}
]
}
}
}]
In the output we are getting, see the column "customizationInfo.surfaces.areas.optionImage": the value should be on the second row instead of the first.
Thanks
I saw an issue where one of my json entries contained a newline - example:
"q10":"blahblahblahe\n",
which seems to not get handled properly. Could I be missing an option to handle this or is there an issue with the character embedded in the string literal (or perhaps my json is not properly escaped - I retrieve via an external API which may be violating the json spec - but it looks right).
It should be valid JSON to have a string containing an escaped newline (\n).
see #54
Is there a way I can parse only specific headers? For instance, only convert name and family to CSV format.
Expected output: name, family.name, family.type
var contacts = [{
name: 'Bob',
lastname: 'Smith',
family: {
name: 'Peter',
type: 'Father'
}
},{
name: 'James',
lastname: 'David',
family:{
name: 'Julie',
type: 'Mother'
}
},{
name: 'Robert',
lastname: 'Miller',
family: null,
location: [1231,3214,4214]
},{
name: 'David',
lastname: 'Martin',
nickname: 'dmartin'
}];
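If the headers option in your version does not also filter columns, a portable fallback is to pick only the wanted top-level keys before export. A sketch with a hypothetical pick helper:

```javascript
// Hypothetical helper: keep only the listed top-level keys, so jsonexport
// never sees lastname/location/nickname and only emits name + family.*.
const pick = (rows, keys) =>
  rows.map(row =>
    Object.fromEntries(
      keys.filter(key => key in row).map(key => [key, row[key]])));

const contacts = [
  { name: 'Bob', lastname: 'Smith', family: { name: 'Peter', type: 'Father' } },
  { name: 'Robert', lastname: 'Miller', family: null, location: [1231, 3214, 4214] }
];

console.log(pick(contacts, ['name', 'family']));
// each row now contains only name and family
```

The picked rows are then passed to jsonexport as usual.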