
trdsql's Issues

feature request: automatically quote columns

Hello,

I've just discovered this great tool, which is incredible, except for this ugly ` quote character required when column names are not fully lowercase, which is really awkward in our bash scripts...

I wonder if it would be possible to have some sort of generic way of quoting columns that could be automatically converted to the correct quote character depending on the DB driver.

The idea is to allow something like SELECT [Id], [Name] FROM something and have it converted automatically to " or `.

Maybe [...] is not the best choice if some drivers already use it, so we may want to switch to something else like {...} or |...|.
It should be something that works for all known DB drivers.

The complex thing is to parse the SQL string to detect what is inside strings and what is not, to perform the replacement only on the SQL part of the query.

What do you think?
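A minimal sketch of such a translation, assuming [...] as the generic marker (the quoteColumns helper is hypothetical, not part of trdsql). It skips anything inside single-quoted SQL string literals, which is the tricky part mentioned above:

```go
package main

import (
	"fmt"
	"strings"
)

// quoteColumns rewrites [name] markers into driver-specific quotes,
// leaving the contents of single-quoted SQL string literals untouched.
func quoteColumns(query, openQ, closeQ string) string {
	var b strings.Builder
	inString := false
	for _, r := range query {
		switch {
		case r == '\'':
			inString = !inString
			b.WriteRune(r)
		case !inString && r == '[':
			b.WriteString(openQ)
		case !inString && r == ']':
			b.WriteString(closeQ)
		default:
			b.WriteRune(r)
		}
	}
	return b.String()
}

func main() {
	q := "SELECT [Id], [Name] FROM something WHERE [Name] <> '[not a column]'"
	fmt.Println(quoteColumns(q, "`", "`")) // SQLite/MySQL style
	fmt.Println(quoteColumns(q, `"`, `"`)) // PostgreSQL style
}
```

A real implementation would also need to handle doubled quotes inside literals and comments, but this shows the per-driver substitution idea.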

Can't you use the WHERE clause?

func main() {
	jsonString := `
{
  "deviceInfo": {
    "deviceName": "设备3",
    "edgeDeviceName": "设备three",
    "topic": "GG2",
    "encryption": false,
    "compression": false
  },
  "telemetry": {
    "ts": 1585194439000,
    "aa": 6,
    "bb": 10,
    "OPC温度": 10370,
    "OPC湿度": "86",
    "OPC电量": true,
	"staType":0
  }
}
`
	trdsql.EnableDebug()
	r := bytes.NewBufferString(jsonString)
	importer, err := trdsql.NewBufferImporter("test", r, trdsql.InFormat(trdsql.JSON))
	if err != nil {
		log.Fatal(err)
	}
	writer := trdsql.NewWriter(trdsql.OutFormat(trdsql.VF))
	trd := trdsql.NewTRDSQL(importer, trdsql.NewExporter(writer))

	err = trd.Exec("SELECT telemetry FROM test where telemetry.aa > 3")
	if err != nil {
		log.Fatal(err)
	}
}

Out :

2021/10/20 10:54:13 driver: sqlite3, dsn: 
2021/10/20 10:54:13 CREATE TEMPORARY TABLE test ( `deviceInfo` text, `telemetry` text );
2021/10/20 10:54:13 INSERT INTO test (`deviceInfo`, `telemetry`) VALUES (?,?);
2021/10/20 10:54:13 SELECT telemetry FROM test where telemetry.aa > 3
2021/10/20 10:54:13 export: no such column: telemetry.aa [SELECT telemetry FROM test where telemetry.aa > 3]
exit status 1

Multiple queries

Please support multiple queries. Currently, it seems that only the last query's result is output, e.g.

$ trdsql 'select 1; select 2,3'
2,3

Note: union is not used here because of different number of columns.

A workaround is to run multiple commands (e.g. trdsql 'select 1'; trdsql 'select 2,3'), but the database will be created multiple times. And if the input comes from stdin, you have to save it first to share the same input between commands.

Standard Deviation

When writing SQL, I need to compute the standard deviation of some columns, e.g.: "select stddev(c1), stddev(c2) from test.csv"

It displays: ERROR(EXPORT):SQL:no such function: stddev
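SQLite, trdsql's default driver, ships no stddev aggregate (PostgreSQL does, so -driver postgres is one route). Alternatively, the population standard deviation can be expressed in plain SQL via the identity stddev(x) = sqrt(avg(x*x) - avg(x)^2), though note that sqlite's sqrt() itself is only available in math-enabled builds. A sketch of that identity:

```go
package main

import (
	"fmt"
	"math"
)

// popStddev computes the population standard deviation using the identity
// stddev(x) = sqrt(avg(x*x) - avg(x)^2), the same form that can be written
// directly in SQL as sqrt(avg(c1*c1) - avg(c1)*avg(c1)) when no stddev()
// aggregate exists.
func popStddev(xs []float64) float64 {
	var sum, sumSq float64
	for _, x := range xs {
		sum += x
		sumSq += x * x
	}
	n := float64(len(xs))
	return math.Sqrt(sumSq/n - (sum/n)*(sum/n))
}

func main() {
	fmt.Println(popStddev([]float64{2, 4, 4, 4, 5, 5, 7, 9})) // 2
}
```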

Ignores nullable columns

Hello,

Great tool! I was exploring some data and stumbled upon an issue where optional columns are dropped. Here is a basic repro case:

$ echo -e '{"method": "GET"}\n{"method": "POST", "meta.user": "with.dot"}' | trdsql "SELECT * FROM -"
"{""method"": ""GET""}"
"{""method"": ""POST"""

Is this expected?

Is there any version of trdsql that works with GLIBC_2.28?

Hi,

Thank you so much for this product. The team I work on has used this quite a bit. I have had to upgrade to Oracle Linux 8.8 and I am trying to get trdsql installed on this box.

I have downloaded the latest version and I am running into the error
/lib64/libm.so.6: version `GLIBC_2.29' not found (required by ./trdsql)

I have subsequently downloaded earlier versions of the *_linux_amd64.zip and run into the same problem each time. Are you aware of a version, or anything I can do, to get this to run with GLIBC_2.28?

Thank you so much for your time and effort,
Joe

UTF-8 Byte Order Markers should be ignored

If a data file includes Byte Order Markers (in my case, UTF-8 BOM ef bb bf), those bytes should be ignored by trdsql, but instead are currently treated as part of the data file:

$ trdsql -ih 'select [Service Type] from data.csv limit 1'
2021/05/13 10:37:05 export: no such column: Service Type [select [Service Type] from `data.csv` limit 1]

$ sed -n '1{s/,.*//; p;}' data.csv  | tee >(hexyl)
Service Type
┌────────┬─────────────────────────┬─────────────────────────┬────────┬────────┐
│00000000│ ef bb bf 53 65 72 76 69 ┊ 63 65 20 54 79 70 65 0a │×××Servi┊ce Type_│
└────────┴─────────────────────────┴─────────────────────────┴────────┴────────┘

trdsql currently thinks the "Service Type" column name is prefixed with the 3 byte BOM:

$ col=$(printf "%b" '\xef\xbb\xbfService Type'); echo $col | hexyl; trdsql -ih -oat "select [$col] from data.csv limit 1"
┌────────┬─────────────────────────┬─────────────────────────┬────────┬────────┐
│00000000│ ef bb bf 53 65 72 76 69 ┊ 63 65 20 54 79 70 65 0a │×××Servi┊ce Type_│
└────────┴─────────────────────────┴─────────────────────────┴────────┴────────┘
+--------------------------------+
|          Service Type          |
+--------------------------------+
| Construction - General         |
+--------------------------------+

I don't know if other encodings also have byte order markers, but I see that Go's standard library will apparently never handle BOMs so each application must deal with them individually 😞

Backquotes in output when selecting from table containing dashes

Hardcoded values containing a dash are unexpectedly output with backquotes when selecting from

  • stdin
  • a filename containing one or more dashes

Table data containing dashes is correctly output without any added backquotes.

I can work around it by enclosing my filenames in quotes when selecting directly from them, but that same method does not work to quote stdin.

Some examples:

$ FILE=/tmp/foo-bar.csv  # filename containing a dash
$ echo "x,y-z" > $FILE
$ cat $FILE | trdsql -oat "select '$FILE' as stdin_quoted_dashes, * from -"                                                                                                                                                             
+---------------------+----+-----+
| stdin_quoted_dashes | c1 | c2  |
+---------------------+----+-----+
| /tmp/foo`-`bar.csv  | x  | y-z |
+---------------------+----+-----+

$ trdsql -oat "select '$FILE' as unquoted_file_full_string_quoted, * from $FILE"                                                                                                                                                        
+----------------------------------+----+-----+
| unquoted_file_full_string_quoted | c1 | c2  |
+----------------------------------+----+-----+
| `/tmp/foo-bar.csv`               | x  | y-z |
+----------------------------------+----+-----+

$ trdsql -oat "select '$FILE' as quoted_file_ok, * from \"$FILE\""                                                                                                                                                                      
+------------------+----+-----+
|  quoted_file_ok  | c1 | c2  |
+------------------+----+-----+
| /tmp/foo-bar.csv | x  | y-z |
+------------------+----+-----+

$ # but can't use the same workaround for stdin:(
$ cat $FILE | trdsql -oat "select '$FILE' as quoted_stdin, * from \"-\""                                                                                                                                                                
2019/08/26 08:40:46 ERROR(EXPORT):SQL:no such table: -
[select '/tmp/foo-bar.csv' as quoted_stdin, * from "-"]

$ uname -srvmpio
Linux 4.9.184-linuxkit #1 SMP Tue Jul 2 22:58:16 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux

$ trdsql --version
0.6.3

Brew SHA256 Mismatch

Hello,

It seems that the latest version "v0.12.0" released has bad SHA256.

==> Fetching noborus/tap/trdsql
==> Downloading https://github.com/noborus/trdsql/releases/download/v0.12.0/trdsql_v0.12.0_darwin_amd64.zip
==> Downloading from https://objects.githubusercontent.com/github-production-release-asset-2e65be/96136924/ebbecaeb-a282-476d-865b-1654176086d7?X-Amz-Algorith
####################################################################################################################################################### 100.0%
Error: SHA256 mismatch
Expected: fa033118104e50989a52d004e42b0f4eb9936b317dc650322732c254a8b6c9a2
  Actual: 977c8652f8cdd6868d68d25dd84e16f519e8e628c9183937b5cb66dadc840e5a
    File: /Users/aperrier/Library/Caches/Homebrew/downloads/a6ee52ac999d0542717ae953fbfc73793c3d9918136ae950ebfb7aa365bc5eb7--trdsql_v0.12.0_darwin_amd64.zip
To retry an incomplete download, remove the file above.

SQL joins are not reflected in json output

When using the join from the example, the JSON output differs from the CSV output (only the latter is correct).

❯ cat hist.csv
1,2017-7-10
2,2017-7-10
2,2017-7-11%

❯ cat user.csv
1,userA
2,userB%

❯ docker run --rm -it -v $(pwd):/tmp noborus/trdsql "SELECT u.c1,u.c2,h.c2 FROM /tmp/user.csv as u LEFT JOIN /tmp/hist.csv as h ON(u.c1=h.c1)"
1,userA,2017-7-10
2,userB,2017-7-10
2,userB,2017-7-11


❯ docker run --rm -it -v $(pwd):/tmp noborus/trdsql -ojson "SELECT u.c1,u.c2,h.c2 FROM /tmp/user.csv as u LEFT JOIN /tmp/hist.csv as h ON(u.c1=h.c1)"
[
  {
    "c1": "1",
    "c2": "2017-7-10"
  },
  {
    "c1": "2",
    "c2": "2017-7-10"
  },
  {
    "c1": "2",
    "c2": "2017-7-11"
  }
]

Want to get Integer values from JSON data instead of exponential notation

About

  • I want to get Integer values from JSON data like CSV parser

Version

% trdsql --version
trdsql version v0.8.0

CSV

data.csv

id,name
1000000,"foo"
1000001,"bar"

Result

  • The results are integer values
% trdsql -icsv -ih "SELECT * FROM data.csv"
1000000,foo
1000001,bar

JSON

data.json

[
  {
    "id": 1000000,
    "name": "foo"
  },
  {
    "id": 1000001,
    "name": "bar"
  }
]

Result

  • The results are exponential notation
    • 1e+06
    • 1.000001e+06
  • I want to get these values as integer
 % trdsql -ijson "SELECT * FROM data.json"
1e+06,foo
1.000001e+06,bar

Querying a set of CSVs with a lot of entries does not output any JSON

I am working on a medium-large open data set of Belgian companies, which is freely downloadable at https://kbopub.economie.fgov.be/kbo-open-data/login

I am trying to execute the following query to denormalize the data into one file. It works fine when the output target is CSV, but nothing happens when outputting JSON:

docker run --rm -it -v $(pwd):/tmp noborus/trdsql -ojson -ih " \
SELECT * FROM /tmp/enterprise.csv as e \
LEFT JOIN /tmp/address.csv as a ON (e.EnterpriseNumber=a.EntityNumber) \
LEFT JOIN /tmp/contact.csv as c ON (e.EnterpriseNumber=c.EntityNumber) \
LEFT JOIN /tmp/denomination.csv as d ON (e.EnterpriseNumber=d.EntityNumber) \
";

Any ideas?

"null" representation

Please add options to specify the null representation (e.g. -in _, -on _ to use underscores as nulls in input and output, in the same way as -id, -od for the delimiter).

It seems that, for output, null and the empty string cannot be differentiated; and there is no way to specify null in the input.

$ trdsql 'select 1,null,2,"",3'
1,,2,,3

$ echo '1,,2,,3' | trdsql 'select coalesce(c2,"null"),iif(c4=="","empty", c4) from -'
,empty

A workaround is to do it manually, e.g. with coalesce and nullif, but it is tedious to do so for all columns.

$ trdsql 'select 1,coalesce(null, "_"),2,"",3'
1,_,2,,3

$ echo '1,_,2,3' | trdsql 'select coalesce(nullif(c2,"_"), "NULL") from -'
NULL
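A sketch of what the proposed -in sentinel could do on import (the helper name and its semantics are assumptions, not trdsql API):

```go
package main

import "fmt"

// applyNullRep implements the proposed -in behavior for one CSV record:
// any field equal to the sentinel becomes a real nil, which an importer
// could then bind as SQL NULL instead of a plain string.
func applyNullRep(record []string, sentinel string) []interface{} {
	out := make([]interface{}, len(record))
	for i, f := range record {
		if f == sentinel {
			out[i] = nil // becomes SQL NULL when bound as a parameter
		} else {
			out[i] = f
		}
	}
	return out
}

func main() {
	fmt.Println(applyNullRep([]string{"1", "_", "2"}, "_")) // [1 <nil> 2]
}
```

The symmetric -on option would do the reverse on export: render NULL result values as the sentinel instead of an empty string.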

build error

- GOOS: linux
- GOARCH: amd64
- github.com/mattn/go-sqlite3
github.com\mattn\go-sqlite3\sqlite3_go18.go:18:10: undefined: SQLiteConn
github.com\mattn\go-sqlite3\sqlite3_go18.go:26:10: undefined: SQLiteConn
github.com\mattn\go-sqlite3\sqlite3_go18.go:27:17: undefined: namedValue
github.com\mattn\go-sqlite3\sqlite3_go18.go:29:13: undefined: namedValue
github.com\mattn\go-sqlite3\sqlite3_go18.go:35:10: undefined: SQLiteConn
github.com\mattn\go-sqlite3\sqlite3_go18.go:36:17: undefined: namedValue
github.com\mattn\go-sqlite3\sqlite3_go18.go:44:10: undefined: SQLiteConn
github.com\mattn\go-sqlite3\sqlite3_go18.go:49:10: undefined: SQLiteConn
github.com\mattn\go-sqlite3\sqlite3_go18.go:54:10: undefined: SQLiteStmt
github.com\mattn\go-sqlite3\sqlite3_go18.go:63:10: undefined: SQLiteStmt
github.com\mattn\go-sqlite3\sqlite3_go18.go:36:17: too many errors

binary is not completely statically compiled

$ trdsql -ijson "select verbose_name from fudaneducn.json group by verbose_name"
trdsql: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by trdsql)
trdsql: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by trdsql)
trdsql: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by trdsql)

It depends on a particular version of glibc

my glibc info

strings /lib/x86_64-linux-gnu/libc.so.6 | grep GLIBC
GLIBC_2.2.5
GLIBC_2.2.6
GLIBC_2.3
GLIBC_2.3.2
GLIBC_2.3.3
GLIBC_2.3.4
GLIBC_2.4
GLIBC_2.5
GLIBC_2.6
GLIBC_2.7
GLIBC_2.8
GLIBC_2.9
GLIBC_2.10
GLIBC_2.11
GLIBC_2.12
GLIBC_2.13
GLIBC_2.14
GLIBC_2.15
GLIBC_2.16
GLIBC_2.17
GLIBC_2.18
GLIBC_2.22
GLIBC_2.23
GLIBC_2.24
GLIBC_2.25
GLIBC_2.26
GLIBC_2.27
GLIBC_PRIVATE
GNU C Library (Ubuntu GLIBC 2.27-3ubuntu1) stable release version 2.27.

Not able to use JSON_EXTRACT

Hi,
first of all thank you for this great tool.

I'm trying to reproduce the JSON_EXTRACT example from the documentation page.
My input is

[
  {
    "id": 1,
    "name": "Drolet",
    "attribute": { "country": "Maldives", "color": "burlywood" }
  },
  {
    "id": 2,
    "name": "Shelly",
    "attribute": { "country": "Yemen", "color": "plum" }
  },
  {
    "id": 3,
    "name": "Tuck",
    "attribute": { "country": "Mayotte", "color": "antiquewhite" }
  }
]

If I run the below command I have nothing.

trdsql -ijson "SELECT id, name, JSON_EXTRACT(attribute,'$country'), JSON_EXTRACT(attribute,'$color') FROM sample2.json"

I'm using trdsql version v0.9.0.

What's wrong in my command?

Thank you

Error: Cannot tap noborus/tap: invalid syntax in tap!

==> Tapping noborus/tap
Cloning into '/usr/local/Homebrew/Library/Taps/noborus/homebrew-tap'...
remote: Enumerating objects: 181, done.
remote: Counting objects: 100% (75/75), done.
remote: Compressing objects: 100% (65/65), done.
remote: Total 181 (delta 42), reused 24 (delta 10), pack-reused 106
Receiving objects: 100% (181/181), 27.49 KiB | 5.50 MiB/s, done.
Resolving deltas: 100% (98/98), done.
Error: Invalid formula: /usr/local/Homebrew/Library/Taps/noborus/homebrew-tap/pgsp.rb
pgsp: Calling bottle :unneeded is disabled! There is no replacement.
Please report this issue to the noborus/tap tap (not Homebrew/brew or Homebrew/core):
/usr/local/Homebrew/Library/Taps/noborus/homebrew-tap/pgsp.rb:9

Error: Cannot tap noborus/tap: invalid syntax in tap!

SQLite "cannot commit - no transaction is active" on large data sets

When a sufficiently large amount of data is passed to trdsql configured to use SQLite, an error is thrown indicating that there is no active transaction to commit.

Here's a demonstration. Although pulling from /dev/urandom, the behavior was consistent. Worked fine for any number less than or equal to 78768, and threw the error for larger numbers. (I would not be surprised if the cutoff number were different in every environment though.)

for byteCnt in 78768 78769
do
	dd if=/dev/urandom bs=$byteCnt count=1 |od -tx1 |sed 's/ /\x0a/g;' |
	trdsql -oat 'select count(*) as RecCnt, c1 as Val from - group by c1 order by count(*) desc limit 5';
done;
1+0 records in
1+0 records out
78768 bytes (79 kB, 77 KiB) copied, 0.0055778 s, 14.1 MB/s
+--------+-----+
| RecCnt | Val |
+--------+-----+
|    353 | a1  |
|    348 |  52 |
|    348 | d8  |
|    347 | 8a  |
|    345 |  81 |
+--------+-----+
1+0 records in
1+0 records out
78769 bytes (79 kB, 77 KiB) copied, 0.0052549 s, 15.0 MB/s
+--------+-----+
| RecCnt | Val |
+--------+-----+
+--------+-----+
2019/10/09 16:26:31 ERROR(COMMIT):cannot commit - no transaction is active

I looked briefly at the relevant code but didn't see anything alarming...

  • trdsql.go: Exec()
  • importer.go: ImportFile()
  • database.go: Import(), insertImport()

`-ijq` does not work correctly from stdin

Hi! Thanks for creating trdsql - it's awesome!

I just discovered this tool and I think it will on many occasions replace jq for me. But I think I've found a minor bug in the current version 0.9.1.

It's best shown using an example.

This works:

$ cat customer_basedata.json | trdsql -ijson 'SELECT id, name FROM -::.entities'

Also works:

$ trdsql -ijq '.entities' 'SELECT id, name FROM customer_basedata.json'

This doesn't work:

$ cat customer_basedata.json | trdsql -ijq '.entities' 'SELECT id, name FROM -'

Error message:

2022/04/22 14:17:01 import: too many columns on -::jq1

By the way, version 0.9.1 is missing on docker hub :)

feature request: support input format yaml file

trdsql is an awesome tool!!!

Can you add support for YAML files? Sometimes I need to deal with YAML files, and I need another tool to convert YAML to JSON. It would be better if YAML were supported by trdsql directly.

Make this project dependency ready

I would like to use this as a dependency of my own application; however, I cannot use it as such because all the code is in the main package.

Please move the supporting code to its own package and reference it from your main.go so it can be used from other Go programs.

Moreover, structs appear to use all private fields, so those would need to be updated as well.

Please also provide an option to supply an input stream to the TRDSQL struct, instead of assuming a file.

Grouping entries with same key inside json output

This one is related to a comment in issue #100

When targeting JSON, a flat array of objects is returned, but when using joins it would be interesting to return an array of objects, each containing an array of objects coming from the joined tables.

Example:

❯ docker run --rm -it -v $(pwd):/tmp noborus/trdsql -ojson "SELECT u.c1 as id,u.c2 as name,h.c2 as date FROM /tmp/user.csv as u LEFT JOIN /tmp/hist.csv as h ON(u.c1=h.c1)" > out.json

results in:

❯ cat out.json
[
  {
    "date": "2017-7-10",
    "id": "1",
    "name": "userA"
  },
  {
    "date": "2017-7-10",
    "id": "2",
    "name": "userB"
  },
  {
    "date": "2017-7-11",
    "id": "2",
    "name": "userB"
  }
]

But it would be cool to have the following output, using the filenames or maybe aliases ('h' as alias for 'hist.csv') as object keys for the nested objects:

[
  {
    "date": "2017-7-10",
    "id": "1",
    "name": "userA"
  },
  {
    "id": "2",
    "name": "userB",
    "h": [
      {
        "date": "2017-7-10"
      },
      {
        "date": "2017-7-11"
      }
    ]
  }
]

Having multiple left joins gives more nested objects on the same level as 'h' above.
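A stdlib-only sketch of the requested grouping as a post-processing step: fold the flat join result by id and nest the joined table's columns under its alias key (the shape differs slightly from the example above in that every entry gets an "h" array, even single-row ones):

```go
package main

import (
	"encoding/json"
	"fmt"
)

type row map[string]string

// groupByID folds the flat join result into one object per id, collecting
// the joined table's columns (here just "date") under the alias key "h".
func groupByID(rows []row) []map[string]interface{} {
	var order []string
	byID := map[string]map[string]interface{}{}
	for _, r := range rows {
		obj, ok := byID[r["id"]]
		if !ok {
			obj = map[string]interface{}{"id": r["id"], "name": r["name"], "h": []map[string]string{}}
			byID[r["id"]] = obj
			order = append(order, r["id"])
		}
		obj["h"] = append(obj["h"].([]map[string]string), map[string]string{"date": r["date"]})
	}
	out := make([]map[string]interface{}, 0, len(order))
	for _, id := range order {
		out = append(out, byID[id])
	}
	return out
}

func main() {
	flat := []row{
		{"id": "1", "name": "userA", "date": "2017-7-10"},
		{"id": "2", "name": "userB", "date": "2017-7-10"},
		{"id": "2", "name": "userB", "date": "2017-7-11"},
	}
	b, _ := json.MarshalIndent(groupByID(flat), "", "  ")
	fmt.Println(string(b))
}
```

Doing this generically inside trdsql would need to know which columns belong to which joined table, which is why exposing the aliases as nesting keys is the interesting part of the proposal.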

-driver postgres returns SSL error

trdsql -driver postgres -ih -ocsv "SELECT * FROM file.csv WHERE lpad(a, 9, '0') != replace(b, '-', '') LIMIT 10"
2021/06/02 13:03:49 begin: pq: SSL is not enabled on the server

I'm not sure what I'm supposed to do here. I just wanted to use the lpad function thus I added the -driver option. Setting up SSL on a localhost instance seems overkill.

How would I use as Library to output to CSV file?

I'm trying to use the library. I can select the fields, but I can't export the result to a CSV file, or I do not understand how to do it.

func main() {
	trd := trdsql.NewTRDSQL(
		trdsql.NewImporter(trdsql.InDelimiter(",")),
		trdsql.NewExporter(trdsql.NewWriter()),
	)
	err := trd.Exec("SELECT COUNT(*) FROM sample.csv")
	if err != nil {
		log.Fatal(err)
	}
}

How can I output the result to CSV?

Support shell text utilities for string functions

The sqlite string functions are not powerful.
How about providing a function to use shell utilities like grep, awk, sed, etc.? For example:

trdsql -ih -oh "select shell(grep -oP '[[:digit:]]{5}', c1) as num from d.csv"

CSV Column Name not recognized

I'm not sure what I'm doing wrong.

I'm trying to read from a csv with the columns Engine_name, Variable_name and Value.
However, for some reason the Engine_name column is not recognized as a header column.
Here are the queries I tried:
Get some info on the data to confirm the data is there: trdsql -ih -a mysql_5.7.x_settings.csv
Output:
The table name is mysql_5.7.x_settings.csv.
The file type is CSV.

Data types:
+-----------------+------+
| column name     | type |
+-----------------+------+
| `Engine_name`   | text |
| `Variable_name` | text |
| `Value`         | text |
+-----------------+------+

Data samples:
+-----------------+---------------------------------------+-------------------------------+
| `Engine_name`   | `Variable_name`                       | `Value`                       |
+-----------------+---------------------------------------+-------------------------------+
| MySQL 5.7.X     | aurora_backtrace_compare_metrics_file | backtrace_compare_metrics.txt |
+-----------------+---------------------------------------+-------------------------------+

Examples:
trdsql -ih "SELECT `Engine_name`, `Variable_name`, `Value` FROM mysql_5.7.x_settings.csv"
trdsql -ih "SELECT `Engine_name`, `Variable_name`, `Value` FROM mysql_5.7.x_settings.csv WHERE `Engine_name` = 'MySQL 5.7.X'"
trdsql -ih "SELECT `Engine_name`, count(`Engine_name`) FROM mysql_5.7.x_settings.csv GROUP BY `Engine_name`"
trdsql -ih "SELECT `Engine_name`, `Variable_name`, `Value` FROM mysql_5.7.x_settings.csv ORDER BY `Engine_name` LIMIT 10"

I tried:
trdsql -ih "SELECT `Engine_name`, `Variable_name`, `Value` FROM mysql_5.7.x_settings.csv ORDER BY `Engine_name` LIMIT 10"
2024/04/17 09:29:08 export: no such column: Engine_name [SELECT Engine_name, Variable_name, Value FROM mysql_5.7.x_settings.csv ORDER BY Engine_name LIMIT 10]

I tried:
trdsql -ih "SELECT Engine_name, Variable_name, Value FROM mysql_5.7.x_settings.csv"
2024/04/17 09:29:27 export: no such column: Engine_name [SELECT Engine_name, Variable_name, Value FROM mysql_5.7.x_settings.csv]

Btw, great job on this tool. I just started using and sharing it internally.

Build the Dockerfile with multi-stage? It also supports ARM64

This is my Dockerfile for testing.

But the image is still huge, about 800 MB

FROM --platform=$BUILDPLATFORM golang:1.16 as builder
ENV TZ=Asia/Shanghai LANG="C.UTF-8"
ARG TARGETARCH
ARG TARGETOS

WORKDIR /workspace
# Copy the Go Modules manifests
COPY go.mod go.mod
COPY go.sum go.sum
ENV GO111MODULE=on
ENV GOPROXY=https://goproxy.io,direct

# Copy the go source
COPY query query

# Build
RUN CGO_ENABLED=1 GOOS=$TARGETOS GOARCH=$TARGETARCH go build --tags='json1' -a -o queryd query/cmd/main.go

FROM golang:1.16-buster
WORKDIR /
COPY --from=builder /workspace/queryd .
ENTRYPOINT ["/queryd"]
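The size comes mostly from the final stage's golang:1.16-buster base; the builder stage is already separate. Since the build uses CGO_ENABLED=1 (the sqlite3 driver links against glibc), the runtime stage only needs a slim glibc base, not a full Go toolchain. A sketch, keeping the builder stage above unchanged:

```dockerfile
# Final stage: swap golang:1.16-buster for a slim glibc base.
# debian:buster-slim still provides the glibc the cgo binary needs.
FROM debian:buster-slim
WORKDIR /
COPY --from=builder /workspace/queryd .
ENTRYPOINT ["/queryd"]
```

This typically brings the image down to tens of megabytes plus the binary; going further (scratch/distroless) would require a fully static build.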

Consecutive Tab Delimiters Collapsed To A Single Character

When querying a tab-delimited file, if there's an empty column, the rest of the row shifts over one position. Works fine as long as there is data in every column.

echo 'a,b,c,d 1,2,3,4 5,,,8 9,A,B,C' | sed 's/,/\t/g; s/ /\n/g;' |trdsql -ih -id '\t' -oat 'select * from -'
+---+---+---+---+
| a | b | c | d |
+---+---+---+---+
| 1 | 2 | 3 | 4 |
| 5 | 8 |   |   | <-- value from column "d" is output under column "b"
| 9 | A | B | C |
+---+---+---+---+

For completeness, I tested all the ASCII values 1-126 as delimiters. The problem only occurs for a few values:

  • 9 (Horizontal Tab)
  • 11 (Vertical Tab)
  • 12 (Form Feed)
  • 32 (Space)
for ord in $(seq 1 126)
do
	[ $ord -eq 10 ] && continue; # linefeed
	[ $ord -eq 13 ] && continue; # carriage return
	[ $ord -eq 34 ] && continue; # double quote
	[ $ord -eq 39 ] && continue; # single quote
	[ $ord -eq 49 ] && continue; # the number 1 (our data below)

	delim=$(printf \\$(printf '%03o' $ord));
	[ $ord -eq 92 ] && delim="\\$delim";

	sedDelim=$delim;
	[ "$delim" = "/" ] && sedDelim="\/";
	[ "$delim" = "&" ] && sedDelim="\&";

	output=$(
		echo "1,,1" | sed "s/,/$sedDelim/g;" |
		trdsql -id "$delim" 'select * from -';
	);

	[ "$output" = "1,,1" ] || echo "$output for ord $ord";
done;
1,1 for ord 9
1,1 for ord 11
1,1 for ord 12
1,1 for ord 32

While I've never encountered a file delimited by VT, FF or Space, I regularly encounter tab delimited files, so this is a fairly important issue.

Specifying column types

Please add an option to auto-detect column types (e.g. -itype), or specify numeric column types (e.g. -inum c2,c3).

Currently, it seems that column types are not auto-detected, which may cause wrong ordering, e.g.

$ d='a,1
b,2
c,11'

$ <<<$d trdsql -a -
The table name is -.
The file type is CSV.

Data types:
+-------------+------+
| column name | type |
+-------------+------+
| c1          | text |
| c2          | text |
+-------------+------+
…

$ <<<$d trdsql 'select * from - order by c2'
a,1
c,11
b,2

$ <<<$d trdsql 'select * from - order by cast(c2 as number)'
a,1
b,2
c,11
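A sketch of the detection an -itype option could perform (the guessType helper is hypothetical, not trdsql API): a column is typed numeric only if every value in it parses as a number, so "11" sorts after "2" without an explicit CAST.

```go
package main

import (
	"fmt"
	"strconv"
)

// guessType returns "numeric" when every value in the column parses as a
// number, and "text" otherwise; a mixed column stays text so no data is
// mangled by a wrong type.
func guessType(values []string) string {
	for _, v := range values {
		if _, err := strconv.ParseFloat(v, 64); err != nil {
			return "text"
		}
	}
	return "numeric"
}

func main() {
	fmt.Println(guessType([]string{"1", "2", "11"})) // numeric
	fmt.Println(guessType([]string{"a", "b", "c"}))  // text
}
```

In practice detection would have to sample only the first N rows (streaming input), which is presumably why it is not the default.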

multiple files syntax error

Hello, first of all, great tool! I use it a lot. If I try to work with multiple files, I get an error.

$ trdsql -ih -id ";" "SELECT * FROM *csv"
2019/07/31 16:41:55 ERROR: SQL:near "*": syntax error
[SELECT * FROM *csv]

This works fine.

$ trdsql -ih -id ";" "SELECT * FROM file.csv"

Did I do something wrong?

Cannot use the tool, process gets instantly killed with 137 exit code

Even if I run just trdsql -version, it exits instantly with a 137 exit code. Any ideas? I'm using 0.10.0 and have installed it (and already tried reinstalling) via Homebrew.

I have used an older version in the past without issues. It's not a memory problem on my computer as I have plenty of RAM available.

~ ❯ trdsql -version                                                                                                                                                                                                        ✘ KILL 05:30:11 PM
[1]    16749 killed     trdsql -version
