
LibPQ.jl's Introduction


LibPQ

LibPQ.jl is a Julia wrapper for the PostgreSQL libpq C library.


Features

Current

  • Build
    • Installs libpq via BinaryBuilder.jl for macOS, GNU Linux, and Windows
  • Connections
    • Connect via DSN
    • Connect via PostgreSQL connection string
    • UTF-8 client encoding
  • Queries
    • Create and execute queries with or without parameters
    • Execute queries asynchronously
    • Stream results using Tables
    • Configurably convert a variety of PostgreSQL types to corresponding Julia types (see the Type Conversions section of the docs)
  • Prepared Statements
    • Create and execute prepared statements with or without parameters
    • Stream table of parameters to execute the same statement multiple times with different data
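
A minimal sketch of the features above (connection string, parameterised query, Tables.jl streaming, prepared statement); the connection string is a placeholder, and the pg_type query is used only so the example runs against any database:

using LibPQ, Tables

# Connect via a PostgreSQL connection string (a DSN also works).
conn = LibPQ.Connection("host=localhost dbname=postgres")

# Parameterised query; the Result supports the Tables.jl interface.
result = execute(conn, "SELECT typname FROM pg_type WHERE oid = \$1", [16])
table = Tables.columntable(result)

# Prepared statement, executed again with a different parameter.
stmt = prepare(conn, "SELECT typname FROM pg_type WHERE oid = \$1")
execute(stmt, [25])

close(conn)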

Goals

Note that these are goals and do not represent the current state of this package.

LibPQ.jl aims to wrap libpq as documented in the PostgreSQL documentation, including all non-deprecated functionality and handling all documented error conditions. Where possible, asynchronous functionality will be wrapped in idiomatic Julia control flow. All Oids returned in query results will have type conversions (to String by default) defined, as long as I can find documentation on their structure. Some effort will be made to integrate with other packages (e.g., Tables, already implemented) to facilitate conversion from query results to a malleable format.

Non-Goals

LibPQ.jl will not attempt to conform to a standard database interface, though anyone is welcome to write a PostgreSQL.jl library to wrap this package.

This package will not:

  • parse SQL
  • emit SQL
  • provide an interface for handling transactions or cursors
  • provide abstractions over common SQL patterns

Possible Goals

This package may not:

  • test on multiple install configurations
  • aim to support any particular versions of libpq or PostgreSQL
  • support conversion from some Oid to some type
  • provide easy access to every possible connection method
  • be as memory-efficient as possible

While I may never get to any of these, I welcome tested, documented contributions!

Licenses

libpq Source and PostgreSQL Documentation

PostgreSQL is Copyright © 1996-2017 by the PostgreSQL Global Development Group.

Postgres95 is Copyright © 1994-5 by the Regents of the University of California.

Permission to use, copy, modify, and distribute this software and its
documentation for any purpose, without fee, and without a written agreement is
hereby granted, provided that the above copyright notice and this paragraph
and the following two paragraphs appear in all copies.

IN NO EVENT SHALL THE UNIVERSITY OF CALIFORNIA BE LIABLE TO ANY PARTY FOR
DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES, INCLUDING
LOST PROFITS, ARISING OUT OF THE USE OF THIS SOFTWARE AND ITS DOCUMENTATION,
EVEN IF THE UNIVERSITY OF CALIFORNIA HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGE.

THE UNIVERSITY OF CALIFORNIA SPECIFICALLY DISCLAIMS ANY WARRANTIES, INCLUDING,
BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE. THE SOFTWARE PROVIDED HEREUNDER IS ON AN “AS-IS” BASIS,
AND THE UNIVERSITY OF CALIFORNIA HAS NO OBLIGATIONS TO PROVIDE MAINTENANCE,
SUPPORT, UPDATES, ENHANCEMENTS, OR MODIFICATIONS.

Everything Else

The license for the remainder of this package appears in LICENSE.

LibPQ.jl's People

Contributors

ararslan, azure-pipelines[bot], c42f, cgopalan, chris-b1, expandingman, fchorney, galenlynch, iamed2, ianbutterworth, jd-foster, juliatagbot, kozross, krynju, lbilli, mattbrzezinski, mjram0s, nicoleepp, omus, one-more-fix, quildtide, quinnj, rofinn, rohitvarkey, svilupp, tanmaykm, tylerloewen, willtebbutt, wookay, wynand


LibPQ.jl's Issues

Non-numeric arrays support

Hi! Is it possible to automatically convert Postgres arrays to Julia arrays? E.g. when there is array_agg in the query.
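
For reference, one workaround while automatic array conversion is unavailable is to aggregate to a delimited string on the server and split it client-side. This is only a sketch; the table t and its text column val are hypothetical:

using LibPQ, Tables

conn = LibPQ.Connection("dbname=postgres")

# string_agg returns one delimited text value instead of a PostgreSQL array.
result = execute(conn, "SELECT string_agg(val, ',') AS vals FROM t")
vals = split(Tables.columntable(result).vals[1], ",")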

Argument error when parsing date time

I am running a very simple insert into a table. My insert query is as follows:

INSERT INTO damage.damages (rule_id, attrs) VALUES ($1, $2) RETURNING *;

I am running it as:

result = execute(
    conn, 
    "INSERT INTO damage.damages (rule_id, attrs) VALUES ($1, $2) RETURNING *", 
    [ruleId, attrs]
)

This works as expected. However when I run the subsequent:

data = fetch!(NamedTuple, result)

I get the following error message. I am not sure why this is the case.

ERROR: ArgumentError: Unable to parse date time. Expected directive Delim(\z) at char 24
Stacktrace:
 [1] macro expansion at ./dates/parse.jl:106 [inlined]
 [2] tryparsenext_core(::String, ::Int64, ::Int64, ::DateFormat{Symbol("y-m-d HH:MM:SS.sssz"),Tuple{Base.Dates.DatePart{'y'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'m'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'d'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'H'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'M'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'S'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'s'},Base.Dates.Delim{Char,1}}}, ::Bool) at ./dates/parse.jl:39
 [3] macro expansion at ./dates/parse.jl:153 [inlined]
 [4] tryparsenext_internal(::Type{TimeZones.ZonedDateTime}, ::String, ::Int64, ::Int64, ::DateFormat{Symbol("y-m-d HH:MM:SS.sssz"),Tuple{Base.Dates.DatePart{'y'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'m'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'d'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'H'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'M'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'S'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'s'},Base.Dates.Delim{Char,1}}}, ::Bool) at ./dates/parse.jl:129
 [5] parse(::Type{TimeZones.ZonedDateTime}, ::String, ::DateFormat{Symbol("y-m-d HH:MM:SS.sssz"),Tuple{Base.Dates.DatePart{'y'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'m'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'d'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'H'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'M'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'S'},Base.Dates.Delim{Char,1},Base.Dates.DatePart{'s'},Base.Dates.Delim{Char,1}}}) at ./dates/parse.jl:270
 [6] parse(::Type{TimeZones.ZonedDateTime}, ::LibPQ.PQValue{0x000004a0}) at /var/task/julia/v0.6/LibPQ/src/parsing.jl:256
 [7] (::LibPQ.#parse_type#40{DataType})(::LibPQ.PQValue{0x000004a0}) at /var/task/julia/v0.6/LibPQ/src/parsing.jl:355
 [8] streamfrom(::LibPQ.Result, ::Type{DataStreams.Data.Field}, ::Type{Union{Missings.Missing, TimeZones.ZonedDateTime}}, ::Int64, ::Int64) at /var/task/julia/v0.6/LibPQ/src/datastreams.jl:60
 [9] macro expansion at /var/task/julia/v0.6/DataStreams/src/query.jl:484 [inlined]
 [10] stream!(::LibPQ.Result, ::DataStreams.Data.Query{0x01,Tuple{DataStreams.Data.QueryColumn{0x01,Union{Int32, Missings.Missing},1,1,:id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},2,2,:global_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},3,3,:damage_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Int32, Missings.Missing},4,4,:rule_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},5,5,:attrs,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, TimeZones.ZonedDateTime},6,6,:insert_timestamp,nothing,()}},(),nothing,nothing}, ::Type{DataStreams.Data.Field}, ::NamedTuples._NT_id_global__id_damage__id_rule__id_attrs_insert__timestamp{Array{Union{Int32, Missings.Missing},1},Array{Union{Missings.Missing, String},1},Array{Union{Missings.Missing, String},1},Array{Union{Int32, Missings.Missing},1},Array{Union{Missings.Missing, String},1},Array{Union{Missings.Missing, TimeZones.ZonedDateTime},1}}, ::DataStreams.Data.Schema{true,Tuple{Union{Int32, Missings.Missing},Union{Missings.Missing, String},Union{Missings.Missing, String},Union{Int32, Missings.Missing},Union{Missings.Missing, String},Union{Missings.Missing, TimeZones.ZonedDateTime}}}, ::Int64) at /var/task/julia/v0.6/DataStreams/src/query.jl:628
 [11] #stream!#122(::Bool, ::Array{Any,1}, ::Function, ::LibPQ.Result, ::DataStreams.Data.Query{0x01,Tuple{DataStreams.Data.QueryColumn{0x01,Union{Int32, Missings.Missing},1,1,:id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},2,2,:global_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},3,3,:damage_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Int32, Missings.Missing},4,4,:rule_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},5,5,:attrs,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, TimeZones.ZonedDateTime},6,6,:insert_timestamp,nothing,()}},(),nothing,nothing}, ::Type{NamedTuples.NamedTuple}) at /var/task/julia/v0.6/DataStreams/src/query.jl:598
 [12] (::DataStreams.Data.#kw##stream!)(::Array{Any,1}, ::DataStreams.Data.#stream!, ::LibPQ.Result, ::DataStreams.Data.Query{0x01,Tuple{DataStreams.Data.QueryColumn{0x01,Union{Int32, Missings.Missing},1,1,:id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},2,2,:global_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},3,3,:damage_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Int32, Missings.Missing},4,4,:rule_id,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, String},5,5,:attrs,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missings.Missing, TimeZones.ZonedDateTime},6,6,:insert_timestamp,nothing,()}},(),nothing,nothing}, ::Type{NamedTuples.NamedTuple}) at ./<missing>:0
 [13] #stream!#120(::Bool, ::Dict{Int64,Function}, ::Function, ::Array{Any,1}, ::Array{Any,1}, ::Void, ::Void, ::Array{Any,1}, ::DataStreams.Data.#stream!, ::LibPQ.Result, ::Type{NamedTuples.NamedTuple}) at /var/task/julia/v0.6/DataStreams/src/query.jl:548
 [14] stream!(::LibPQ.Result, ::Type{NamedTuples.NamedTuple}) at /var/task/julia/v0.6/DataStreams/src/query.jl:526
 [15] #fetch!#43(::Array{Any,1}, ::Function, ::Type{T} where T, ::LibPQ.Result) at /var/task/julia/v0.6/LibPQ/src/datastreams.jl:138
 [16] runQuery(::LibPQ.Connection, ::String, ::Vararg{Any,N} where N) at /tmp/tmpZS6Hr3/julia/ssql/PostgreSql.jl:16
 [17] (::DamageDb.#make_damage#4{Dict{Symbol,Any}})(::LibPQ.Connection) at /tmp/tmpZS6Hr3/julia/ssql/DamageDb.jl:25
 [18] transact(::LibPQ.Connection, ::DamageDb.#make_damage#4{Dict{Symbol,Any}}) at /tmp/tmpZS6Hr3/julia/ssql/PostgreSql.jl:23
 [19] #new_damage#1(::Array{Any,1}, ::Function) at /tmp/tmpZS6Hr3/julia/ssql/DamageDb.jl:63
 [20] (::DamageDb.#kw##new_damage)(::Array{Any,1}, ::DamageDb.#new_damage) at ./<missing>:0
 [21] evalRequest(::Array{Dict{String,Any},1}, ::Dict{String,Any}) at /tmp/tmpZS6Hr3/julia/ssql/Engine.jl:61
 [22] lambda_function_with_event(::Dict{String,Any}) at /tmp/tmpZS6Hr3/module_ssql_request.jl:12
 [23] invoke_lambda(::Module, ::Dict{String,Any}, ::IOStream) at /var/task/julia/v0.6/AWSLambdaWrapper.jl:35
 [24] open(::AWSLambdaWrapper.##1#2{Module}, ::String, ::String) at ./iostream.jl:152
 [25] main(::Module) at /var/task/julia/v0.6/AWSLambdaWrapper.jl:50
 [26] process_options(::Base.JLOptions) at ./client.jl:286
 [27] _start() at ./client.jl:371

The table is fairly simple as well. Its schema is as follows:

CREATE TABLE IF NOT EXISTS damage.damages (                                                                                    
  rule_id integer NOT NULL,                                              
  attrs jsonb,                                                                                          
  insert_timestamp timestamptz default now()                                                         
); 

It seems to have to do with the insert_timestamp field, but I'm not sure why it's having a hard time parsing the data.

documentation or more tools for dealing with logger output

Currently if you upload an entire table with Data.stream! it spits out a number of "info" statements equal to the number of rows. It would be nice if this could be mitigated somehow. Glancing through the code, it looks like there may already be a nice way of doing this, but without documentation I'll have to dig through it a good deal more to figure out how.

Is this indeed something that's already supported?
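
Not an official answer, but since LibPQ logs through Memento, raising the level on its logger should suppress the per-row info records (a sketch, assuming a Memento-based version of LibPQ):

using Memento

# Drop LibPQ log records below "warn", silencing the per-row "info" lines.
setlevel!(getlogger("LibPQ"), "warn")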

Remove DataStreams support

Since the library itself is deprecated, we should remove support for it here.

Tables.jl should be an adequate replacement, and that is already implemented.

Doing this will allow an AbstractResult type which may be useful for #84

Incompatibility with DataFrames v0.18.1

Hi, thanks for your nice package.
Since the release of DataFrames v0.18.1 the following does not work anymore:

using LibPQ, DataFrames
conn = LibPQ.Connection("postgresql://**correct con string**")
data = fetch!(DataFrame, execute(conn, "SELECT 1"))

The error message is:

MethodError: no method matching streamtypes(::Type{DataFrame})

My versioninfo is:

Julia Version 1.1.0
Commit 80516ca202 (2019-01-21 21:24 UTC)
Platform Info:
  OS: Windows (x86_64-w64-mingw32)
  CPU: Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-6.0.1 (ORCJIT, skylake)
Environment:
  JULIA_NUM_THREADS = 4

Issue with initialization

Hi, first of all, thank you for putting this package together! I think your most recent update might have some sort of conflict with DataStreams. I get this error upon importing the package (e.g. using LibPQ):

ERROR: LoadError: LoadError: UndefVarError: accesspattern not defined
Stacktrace:
 [1] include_from_node1(::String) at ./loading.jl:576
 [2] include(::String) at ./sysimg.jl:14
 [3] include_from_node1(::String) at ./loading.jl:576
 [4] eval(::Module, ::Any) at ./boot.jl:235
 [5] _require(::Symbol) at ./loading.jl:490
 [6] require(::Symbol) at ./loading.jl:405
while loading /Users/christopheralexander/.julia/v0.6/LibPQ/src/datastreams.jl, in expression starting on line 27
while loading /Users/christopheralexander/.julia/v0.6/LibPQ/src/LibPQ.jl, in expression starting on line 571

I'm not sure if that is exported? Or perhaps not with the most recent tagged version of DataStreams.

Thanks!

Ambiguous parse method

MethodError: parse(::Type{Int32}, ::LibPQ.PQValue{0x00000017}) is ambiguous. Candidates:
  parse(T::Type{#s3118} where #s3118<:Integer, s) in YAML at /root/.julia/v0.6/YAML/src/YAML.jl:23
  parse(::Type{T}, pqv::LibPQ.PQValue) where T in LibPQ at /root/.julia/v0.6/LibPQ/src/parsing.jl:110
Possible fix, define
  parse(::Type{T<:Integer}, ::LibPQ.PQValue)
(::LibPQ.#parse_type#40{DataType})(::LibPQ.PQValue{0x00000017}) at /root/.julia/v0.6/LibPQ/src/parsing.jl:353
streamfrom(::LibPQ.Result, ::Type{DataStreams.Data.Field}, ::Type{Int32}, ::Int64, ::Int64) at /root/.julia/v0.6/LibPQ/src/datastreams.jl:77
macro expansion at /root/.julia/v0.6/DataStreams/src/DataStreams.jl:542 [inlined]
stream!(::LibPQ.Result, ::Type{DataStreams.Data.Field}, ::DataFrames.DataFrameStream{Tuple{Array{Int32,1},Array{String,1},Array{Bool,1}}}, ::DataStreams.Data.Schema{true,Tuple{Int32,String,Bool}}, ::Int64, ::Tuple{Base.#identity,Base.#identity,Base.#identity}, ::DataStreams.Data.##15#16, ::Array{Any,1}, ::Type{Ref{(:node_id, :name, :tradable)}}) at /root/.julia/v0.6/DataStreams/src/DataStreams.jl:614
#stream!#17(::Bool, ::Dict{Int64,Function}, ::Function, ::Array{Any,1}, ::Array{Any,1}, ::Function, ::LibPQ.Result, ::Type{DataFrames.DataFrame}) at /root/.julia/v0.6/DataStreams/src/DataStreams.jl:490
stream!(::LibPQ.Result, ::Type{DataFrames.DataFrame}) at /root/.julia/v0.6/DataStreams/src/DataStreams.jl:475
#fetch!#43(::Array{Any,1}, ::Function, ::Type{T} where T, ::LibPQ.Result) at /root/.julia/v0.6/LibPQ/src/datastreams.jl:138
...

Build error in Windows 10

Hi,

I am trying to use LibPQ on Windows 10 and I'm getting the following error when building:

pkg> build LibPQ
  Building EzXML ────→ `C:\Users\nicjk\.julia\packages\EzXML\DUxj7\deps\build.log`
  Building TimeZones → `C:\Users\nicjk\.julia\packages\TimeZones\WMDpl\deps\build.log`
  Building LibPQ ────→ `C:\Users\nicjk\.julia\packages\LibPQ\LODsS\deps\build.log`
┌ Error: Error building `LibPQ`:
│ ┌ Warning: On Windows, creating file symlinks requires Administrator privileges
│ └ @ Base.Filesystem file.jl:794
│ ┌ Warning: platform_key() is deprecated, use platform_key_abi() from now on
│ │   caller = ip:0x0
│ └ @ Core :-1
│ ┌ Warning: Could not extract the platform key of https://github.com/invenia/LibPQBuilder/releases/download/v10.3-1-4/libpq.x86_64-w64-mingw32.tar.gz; continuing...
│ └ @ BinaryProvider C:\Users\nicjk\.julia\packages\BinaryProvider\4F5Hq\src\Prefix.jl:185
│ [ Info: Destination file C:\Users\nicjk\.julia\packages\LibPQ\LODsS\deps\usr\downloads\libpq.x86_64-w64-mingw32.tar.gz already exists, verifying...
│ [ Info: Hash cache is consistent, returning true
│ [ Info: Installing C:\Users\nicjk\.julia\packages\LibPQ\LODsS\deps\usr\downloads\libpq.x86_64-w64-mingw32.tar.gz into C:\Users\nicjk\.julia\packages\LibPQ\LODsS\deps\usr
│ [ Info: Removing files installed by manifests\libpq.x86_64-w64-mingw32.list
│ [ Info:   lib\libpq.dll removed
│ [ Info:   lib\libpq.lib removed
│ [ Info:   lib\libeay32.lib removed
│ [ Info:   lib\ssleay32.lib removed
│ [ Info:   lib\libintl.lib removed
│ [ Info:   Culling empty directory lib
│ [ Info:   bin\libpq.dll removed
│ [ Info:   bin\libeay32.dll removed
│ [ Info:   bin\ssleay32.dll removed
│ [ Info:   bin\libintl-8.dll removed
│ [ Info:   bin\libiconv-2.dll removed
│ [ Info:   Culling empty directory bin
│ [ Info:   manifests\libpq.x86_64-w64-mingw32.list removed
│ ERROR: LoadError: LibraryProduct(nothing, ["libpq"], :LIBPQ_HANDLE, "Prefix(C:\\Users\\nicjk\\.julia\\packages\\LibPQ\\LODsS\\deps\\usr)") is not satisfied, cannot generate deps.jl!
│ Stacktrace:
│  [1] error(::String) at .\error.jl:33
│  [2] #write_deps_file#156(::Bool, ::Function, ::String, ::Array{Product,1}) at C:\Users\nicjk\.julia\packages\BinaryProvider\4F5Hq\src\Products.jl:414
│  [3] write_deps_file(::String, ::Array{Product,1}) at C:\Users\nicjk\.julia\packages\BinaryProvider\4F5Hq\src\Products.jl:395
│  [4] top-level scope at C:\Users\nicjk\.julia\packages\LibPQ\LODsS\deps\build.jl:34
│  [5] include at .\boot.jl:326 [inlined]
│  [6] include_relative(::Module, ::String) at .\loading.jl:1038
│  [7] include(::Module, ::String) at .\sysimg.jl:29
│  [8] include(::String) at .\client.jl:403
│  [9] top-level scope at none:0
│ in expression starting at C:\Users\nicjk\.julia\packages\LibPQ\LODsS\deps\build.jl:24
│
│ 7-Zip 18.05 (x64) : Copyright (c) 1999-2018 Igor Pavlov : 2018-04-30
│
│
│ Extracting archive:
│ --
│ Path =
│ Type = tar
│ Code Page = UTF-8
│
│ Everything is Ok
│
│ Folders: 3
│ Files: 10
│ Size:       5890457
│ Compressed: 16384
└ @ Pkg.Operations C:\cygwin\home\Administrator\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.1\Pkg\src\Operations.jl:1075

My version info is:

Julia Version 1.1.0
Commit 80516ca202 (2019-01-21 21:24 UTC)
Platform Info:
  OS: Windows (x86_64-w64-mingw32)
  CPU: Intel(R) Core(TM) i7-3770 CPU @ 3.40GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-6.0.1 (ORCJIT, ivybridge)

Any help would be greatly appreciated -- thanks!

How can I execute a prepared statement with a SQL function?

Table DDL:

CREATE TABLE heres(
  id serial,
  name varchar,
  lnglat geometry("POINT", 4326),
  primary key (id)
);

LibPQ.execute(conn, "INSERT INTO heres (name, lnglat) VALUES ('test2',ST_GeomFromEWKT('SRID=4326;POINT(12 34)'))")
or
LibPQ.execute(conn, "INSERT INTO heres (name, lnglat) VALUES ('test2','0101000020e61000000000000000000000000000000000f03f')")
are OK, but when I try to execute a prepared statement with a SQL function as follows:

prepare_sql = "INSERT INTO heres (name, lnglat) VALUES (\$1, \$2)"
stmt = LibPQ.prepare(conn, prepare_sql)
LibPQ.execute(stmt, ["test", ST_GeomFromEWKT("SRID=4326;POINT(12 34)")])

I got this error:

ERROR: ERROR:  parse error - invalid geometry
HINT:  "SQ" <-- parse error at position 2 within geometry

How can I execute a prepared statement with a SQL function?
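
One approach that should work (a sketch, assuming PostGIS; untested against this schema) is to keep the function call in the SQL text and bind only the EWKT string as a parameter:

prepare_sql = "INSERT INTO heres (name, lnglat) VALUES (\$1, ST_GeomFromEWKT(\$2))"
stmt = LibPQ.prepare(conn, prepare_sql)
LibPQ.execute(stmt, ["test", "SRID=4326;POINT(12 34)"])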

Conversion of dates

Hi,

When selecting data, timestamps are properly converted to Base.Dates.DateTime, but dates are fetched as strings. Is there any reason not to convert them to Base.Dates.Date?

result = LibPQ.execute(connection, "SELECT d, dt FROM public.test;")
data = DataStreams.Data.stream!(result, DataStreams.Data.Table)
dump(data)
_NT_d_dt{Array{Union{Missings.Missing, String},1},Array{Union{DateTime, Missings.Missing},1}}
  d: Array{Union{Missings.Missing, String}}((1,))
    1: String "2018-01-01"
  dt: Array{Union{DateTime, Missings.Missing}}((1,))
    1: DateTime
      instant: Base.Dates.UTInstant{Base.Dates.Millisecond}
        periods: Base.Dates.Millisecond
          value: Int64 63650491200000

Thanks,
Piotr

Automatically clear results

Would it be useful to write a finalizer for the Result type that calls clear when the result goes out of scope and has not already been cleared?
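
As a sketch of what that could look like (a stand-in wrapper type, not LibPQ internals; it assumes close(result) frees the underlying result as in current LibPQ):

mutable struct AutoClearResult
    result::LibPQ.Result

    function AutoClearResult(result::LibPQ.Result)
        wrapper = new(result)
        # Clear the wrapped result when the wrapper is garbage-collected.
        finalizer(w -> close(w.result), wrapper)
        return wrapper
    end
end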

Unable to connect to Redshift

Trying to connect to Redshift fails with ERROR: FATAL: unrecognized configuration parameter "IntervalStyle" from /.julia/packages/LibPQ/fqxvQ/src/LibPQ.jl:178. AFAIK it should be possible to use libpq with Redshift.

`Tables.schema` returns a bad type

How it should look:

Tables.Schema{(:A, :B),Tuple{Union{Missing, Int64},Float64}}

how it looks now:

Tables.Schema{(Symbol("Union{Missing, Int64}"), :Float64), Tuple{:A, :B}}

Add Memento logging

Logging using Memento.jl will allow suppressing log messages below a certain level, paving the way for adding verbose debug log messages.

Add a password prompt

Use PQconnectionNeedsPassword and PQconnectionUsedPassword to prompt for a password in appropriate cases. Use securezero! to wipe the password data once used.
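
A rough sketch of that flow; only the Base password handling is real API, while the two connection helpers are hypothetical stand-ins for the libpq calls named above:

function connect_with_prompt(conn)
    if connection_needs_password(conn)        # hypothetical wrapper for PQconnectionNeedsPassword
        secret = Base.getpass("Password")     # returns a Base.SecretBuffer
        try
            reconnect_with_password(conn, read(secret, String))  # hypothetical
        finally
            Base.shred!(secret)               # wipe the password data once used
        end
    end
    return conn
end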

Statements and DataStreams Sink

It looks like prepared statements are how literally everyone does things so that's how I will as well. And the Statement type should serve as a nice Sink for DataStreams.

MethodError when entering the password in the REPL to connect to a database

When trying to connect to a database with the command conn = LibPQ.Connection("dbname=mydb"), I am asked to enter my user password. This fails with the following error:
ERROR: MethodError: Cannot convert an object of type Base.SecretBuffer to an object of type String

I can of course connect to the database with the command conn = LibPQ.Connection("dbname=mydb password=mypassword"), but it would be nice to be able to interactively enter the password from the REPL.

Alberto

Parsing array of strings?

I would like to use queries that return arrays of text; however, parsing these columns into arrays does not currently work: I just get the string representing the PostgreSQL array instead. I also don't know how to extend the current array parsing code to work with arrays of strings, because there is no entry in _DEFAULT_TYPE_MAP for :text. I am currently trying to figure out how columns of text are normally converted, so I can make this work. Any pointers?

Heavy memory load when doing a large number of transactions using prepared statements

I'm doing approximately 2^20 transactions, each writing about 20 table rows, using a set of four small prepared statements. I'm doing this in parallel using four threads, on separate connections. For some reason, the memory load (in terms of RAM use) keeps climbing and climbing, even though I ensure that I commit all transactions as soon as I can. In fact, I've almost exhausted all 16G of RAM available on my machine trying this several times.

Is there some kind of hidden memory use this library has when using prepared statements or transactions, or is there a memory leak?
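
One thing worth ruling out (a guess, not a confirmed diagnosis): results that are never cleared keep the full PGresult alive on the C side, so closing each result as soon as it has been consumed keeps memory bounded:

result = execute(stmt, params)
# ... consume the result ...
close(result)   # frees the underlying PGresult right away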

Allow tests without requiring Unix sockets

Usually when I am interacting with a local Postgres server I end up using a Docker container. It would be nice if you allowed an environment variable to specify the connection string, so that I could supply the port number of my database server.

For example if I wanted to run the tests using a PostgreSQL container I could run the following to get a working PostgreSQL running on localhost:5555 using the user "postgres" with no password:

docker run --name libpq-test -p 5555:5432 -d postgres
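
A sketch of the requested behaviour; the LIBPQJL_DATABASE_URL variable name is made up for illustration:

# In runtests.jl: fall back to the current local-socket default when the variable is unset.
conn_str = get(ENV, "LIBPQJL_DATABASE_URL", "dbname=postgres")
conn = LibPQ.Connection(conn_str)

With the container above, the tests would then run with LIBPQJL_DATABASE_URL="host=localhost port=5555 user=postgres".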

MethodError: no method matching isdone

I get errors like this when trying to insert values.

MethodError: no method matching isdone(::NamedTuple{(:symbol, :col1, :col2, :col3),Tuple{String,Float64,Float64,Float64}}, ::Int64, ::Int64)
You may have intended to import Base.isdone

Also, in the example doc page at https://github.com/invenia/LibPQ.jl/blob/master/docs/src/index.md, using NamedTuples results in an error because NamedTuple is now part of Base. It might be clearer if you updated the doc and provided an example of data to insert in the Insertion section.
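
For anyone updating the example, here is a sketch of an insert that uses only a Base NamedTuple of columns (no NamedTuples.jl). It assumes a LibPQ version with the Tables-based LibPQ.load! and a hypothetical table my_table:

using LibPQ

conn = LibPQ.Connection("dbname=postgres")

# A Base NamedTuple of column vectors is a valid Tables.jl table.
data = (symbol = ["a", "b"], col1 = [1.0, 2.0], col2 = [3.0, 4.0], col3 = [5.0, 6.0])

LibPQ.load!(
    data,
    conn,
    "INSERT INTO my_table (symbol, col1, col2, col3) VALUES (\$1, \$2, \$3, \$4);",
)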

LibPQ failing to precompile - Julia v1.0

Running on Ubuntu 16.04 LTS.
I installed LibPQ using (v1.0) pkg> add LibPQ#master as I was previously getting an error when I tried installing via Pkg.add("LibPQ").

julia> using LibPQ
[ Info: Precompiling LibPQ [194296ae-ab2e-5f79-8cd4-7183a0a5a0d1]
ERROR: LoadError: LoadError: UndefVarError: start not defined
Stacktrace:
 [1] getproperty(::Module, ::Symbol) at ./sysimg.jl:13
 [2] top-level scope at none:0
 [3] include at ./boot.jl:317 [inlined]
 [4] include_relative(::Module, ::String) at ./loading.jl:1038
 [5] include at ./sysimg.jl:29 [inlined]
 [6] include(::String) at /home/mdsalerno/.julia/packages/LibPQ/oOJbp/src/LibPQ.jl:1
 [7] top-level scope at none:0
 [8] include at ./boot.jl:317 [inlined]
 [9] include_relative(::Module, ::String) at ./loading.jl:1038
 [10] include(::Module, ::String) at ./sysimg.jl:29
 [11] top-level scope at none:2
 [12] eval at ./boot.jl:319 [inlined]
 [13] eval(::Expr) at ./client.jl:389
 [14] top-level scope at ./none:3
in expression starting at /home/mdsalerno/.julia/packages/LibPQ/oOJbp/src/typemaps.jl:189
in expression starting at /home/mdsalerno/.julia/packages/LibPQ/oOJbp/src/LibPQ.jl:58
ERROR: Failed to precompile LibPQ [194296ae-ab2e-5f79-8cd4-7183a0a5a0d1] to /home/mdsalerno/.julia/compiled/v1.0/LibPQ/LeQQU.ji.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] macro expansion at ./logging.jl:313 [inlined]
 [3] compilecache(::Base.PkgId, ::String) at ./loading.jl:1184
 [4] _require(::Base.PkgId) at ./logging.jl:311
 [5] require(::Base.PkgId) at ./loading.jl:852
 [6] macro expansion at ./logging.jl:311 [inlined]
 [7] require(::Module, ::Symbol) at ./loading.jl:834

Inexact Error at timestamp parsing

PostgreSQL uses timestamps with microsecond resolution, while Julia's DateTime supports only milliseconds. Given a connection conn, the following works:

fetch!(NamedTuple, execute(conn, "select '2001-01-01 12:12:12.123'::timestamp"))

while

fetch!(NamedTuple, execute(conn, "select '2001-01-01 12:12:12.1234'::timestamp"))

fails with an InexactError in parsing.jl, in Base.parse(::Type{DateTime}, pqv::PQValue{PQ_SYSTEM_TYPES[:timestamp]}). Would it be OK for now to just cut off the last digits of the timestamp before parsing?
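
As a stop-gap on the SQL side (a workaround sketch, not a fix in the parser), truncating to milliseconds before the value leaves the server avoids the error:

fetch!(NamedTuple, execute(conn, "select date_trunc('milliseconds', '2001-01-01 12:12:12.1234'::timestamp)"))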

Cannot install on Julia 0.4.5

julia> dlopen(:libpq)
WARNING: dlopen is deprecated, use Libdl.dlopen instead.
in depwarn at ./deprecated.jl:73
while loading no file, in expression starting on line 0
Ptr{Void} @0x00000000038c2160

julia> Pkg.build("LibPQ")
ERROR: LibPQ is not an installed package
in build! at ./pkg/entry.jl:654
in build! at ./pkg/entry.jl:698

First example in documentation throws UndefVarError

I tried to do this example (modified with placeholder data instead of mine or the example's):

using LibPQ
conn = LibPQ.Connection("dbname=foobar user=baz password=quux")
result = execute(conn, "SELECT * FROM my_table;")
data = Data.stream!(result, NamedTuple)
clear!(result)

However, when I got to the Data.stream!, I get the following error message:

ERROR: UndefVarError: Data not defined
Stacktrace:
 [1] macro expansion at ./REPL.jl:97 [inlined]
 [2] (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at ./event.jl:73

Did I forget to use something? If so, it'd be good if the examples listed what else I needed to be using for them to work.
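
For reference, Data here is the DataStreams.jl module, so the example presumably also needs that package loaded (a guess based on the stack trace, not a documentation fix):

using LibPQ
using DataStreams   # provides the Data module used by Data.stream!

conn = LibPQ.Connection("dbname=foobar user=baz password=quux")
result = execute(conn, "SELECT * FROM my_table;")
data = Data.stream!(result, NamedTuple)
clear!(result)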

Cannot install on Julia 1.1.0

After updating to Julia 1.1.0, I cannot install LibPQ anymore.

(v1.1) pkg> add LibPQ                                                                                                                                                                                [52/84]
   Cloning default registries into `~/.julia`
   Cloning registry from "https://github.com/JuliaRegistries/General.git"
     Added registry `General` to `~/.julia/registries/General`
 Resolving package versions...
 Installed LayerDicts ────────── v0.1.2
 Installed WeakRefStrings ────── v0.5.4
 Installed Decimals ──────────── v0.3.1
 Installed Missings ──────────── v0.4.0
 Installed DataStreams ───────── v0.4.1
 Installed IterTools ─────────── v1.1.1
 Installed BinaryProvider ────── v0.5.3
 Installed LibPQ ─────────────── v0.6.1
 Installed OffsetArrays ──────── v0.9.1
 Installed Mocking ───────────── v0.5.7
 Installed EzXML ─────────────── v0.9.0
 Installed Memento ───────────── v0.10.0
 Installed JSON ──────────────── v0.20.0
 Installed Compat ────────────── v1.4.0
 Installed TimeZones ─────────── v0.8.4
 Installed Syslogs ───────────── v0.2.0
 Installed Nullables ─────────── v0.0.8
 Installed DocStringExtensions ─ v0.6.0
  Updating `~/.julia/environments/v1.1/Project.toml`
  [194296ae] + LibPQ v0.6.1
  Updating `~/.julia/environments/v1.1/Manifest.toml`
  [b99e7846] + BinaryProvider v0.5.3
  [34da2185] + Compat v1.4.0
  [9a8bc11e] + DataStreams v0.4.1
  [abce61dc] + Decimals v0.3.1
  [ffbed154] + DocStringExtensions v0.6.0
  [8f5d6c58] + EzXML v0.9.0
  [c8e1da08] + IterTools v1.1.1
  [682c06a0] + JSON v0.20.0
  [6f188dcb] + LayerDicts v0.1.2
  [194296ae] + LibPQ v0.6.1
  [f28f55f0] + Memento v0.10.0
  [e1d29d7a] + Missings v0.4.0
  [78c3b35d] + Mocking v0.5.7
  [4d1e1d77] + Nullables v0.0.8
  [6fe1bfb0] + OffsetArrays v0.9.1
  [cea106d9] + Syslogs v0.2.0
  [f269a46b] + TimeZones v0.8.4
  [ea10d353] + WeakRefStrings v0.5.4
  [2a0f44e3] + Base64
  [ade2ca70] + Dates
  [8bb1440f] + DelimitedFiles
  [8ba89e20] + Distributed
  [b77e0a4c] + InteractiveUtils
  [76f85450] + LibGit2                                                                                                                                                             
  [8f399da3] + Libdl
  [37e2e46d] + LinearAlgebra
  [56ddb016] + Logging
  [d6f4376e] + Markdown
  [a63ad114] + Mmap
  [44cfe95a] + Pkg
  [de0858da] + Printf
  [3fa0cd96] + REPL
  [9a3f8284] + Random
  [ea8e919c] + SHA
  [9e88b42a] + Serialization
  [1a1011a3] + SharedArrays
  [6462fe0b] + Sockets
  [2f01184e] + SparseArrays
  [10745b16] + Statistics
  [8dfed614] + Test
  [cf7118a7] + UUIDs
  [4ec0a83e] + Unicode
  Building EzXML ──── `~/.julia/packages/EzXML/DUxj7/deps/build.log`
  Building TimeZones  `~/.julia/packages/TimeZones/dubpG/deps/build.log`
┌ Error: Error building `TimeZones`:
│   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
│                                  Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0
│ curl: (35) error:1425F175:SSL routines:ssl_choose_client_version:inappropriate fallback
│ [ Info: Downloading 2018i tzdata
│ ERROR: LoadError: failed process: Process(`curl -g -L -f -o /Users/guillaume/.julia/packages/TimeZones/dubpG/deps/tzarchive/tzdata2018i.tar.gz https://www.iana.org/time-zones/repository/releases/tzdata2018i.tar.gz`, ProcessExited(35)) [35]
│ Stacktrace:
│  [1] error(::String, ::Base.Process, ::String, ::Int64, ::String) at ./error.jl:42
│  [2] pipeline_error at ./process.jl:785 [inlined]
│  [3] #run#515(::Bool, ::Function, ::Cmd) at ./process.jl:726
│  [4] run at ./process.jl:724 [inlined]
│  [5] download(::String, ::String) at ./download.jl:27
│  [6] tzdata_download(::String, ::String) at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/tzdata/download.jl:90
│  [7] #build#32(::Bool, ::Function, ::String, ::Array{String,1}, ::String, ::String, ::String) at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/tzdata/build.jl:37
│  [8] #build at ./tuple.jl:0 [inlined]
│  [9] build(::String, ::Array{String,1}) at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/tzdata/build.jl:72
│  [10] #build#5(::Bool, ::Function, ::String, ::Array{String,1}) at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/TimeZones.jl:116
│  [11] build at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/TimeZones.jl:116 [inlined] (repeats 2 times)
│  [12] top-level scope at none:0
│  [13] include at ./boot.jl:326 [inlined]
│  [14] include_relative(::Module, ::String) at ./loading.jl:1038
│  [15] include(::Module, ::String) at ./sysimg.jl:29
│  [16] include(::String) at ./client.jl:403
│  [17] top-level scope at none:0in expression starting at /Users/guillaume/.julia/packages/TimeZones/dubpG/deps/build.jl:6
└ @ Pkg.Operations /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.1/Pkg/src/Operations.jl:1075
  Building LibPQ ──── `~/.julia/packages/LibPQ/ABdaZ/deps/build.log`

then

julia> using LibPQ
[ Info: Precompiling LibPQ [194296ae-ab2e-5f79-8cd4-7183a0a5a0d1]
ERROR: LoadError: InitError: ArgumentError: Unknown time zone "Europe/Paris"
Stacktrace:
 [1] (::getfield(TimeZones, Symbol("##1#3")){SubString{String}})() at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/TimeZones.jl:70
 [2] get!(::getfield(TimeZones, Symbol("##1#3")){SubString{String}}, ::Dict{AbstractString,Dates.TimeZone}, ::SubString{String}) at ./dict.jl:453
 [3] Type at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/TimeZones.jl:64 [inlined]
 [4] localzone() at /Users/guillaume/.julia/packages/TimeZones/dubpG/src/local.jl:25
 [5] Type at /Users/guillaume/.julia/packages/Memento/QMKyB/src/formatters.jl:35 [inlined]
 [6] #config!#71(::String, ::Dict{AbstractString,Int64}, ::Bool, ::Bool, ::Bool, ::Bool, ::Function, ::Memento.Logger, ::String) at /Users/guillaume/.julia/packages/Memento/QMKyB/src/config.jl:39
 [7] #config!#69 at /Users/guillaume/.julia/packages/Memento/QMKyB/src/config.jl:36 [inlined]
 [8] config! at /Users/guillaume/.julia/packages/Memento/QMKyB/src/config.jl:25 [inlined]
 [9] __init__() at /Users/guillaume/.julia/packages/Memento/QMKyB/src/Memento.jl:67
 [10] _include_from_serialized(::String, ::Array{Any,1}) at ./loading.jl:633
 [11] _require_from_serialized(::String) at ./loading.jl:684
 [12] _require(::Base.PkgId) at ./loading.jl:967
 [13] require(::Base.PkgId) at ./loading.jl:858
 [14] require(::Module, ::Symbol) at ./loading.jl:853
 [15] include at ./boot.jl:326 [inlined]
 [16] include_relative(::Module, ::String) at ./loading.jl:1038
 [17] include(::Module, ::String) at ./sysimg.jl:29
 [18] top-level scope at none:2
 [19] eval at ./boot.jl:328 [inlined]
 [20] eval(::Expr) at ./client.jl:404
 [21] top-level scope at ./none:3
during initialization of module Memento
in expression starting at /Users/guillaume/.julia/packages/LibPQ/ABdaZ/src/LibPQ.jl:15
ERROR: Failed to precompile LibPQ [194296ae-ab2e-5f79-8cd4-7183a0a5a0d1] to /Users/guillaume/.julia/compiled/v1.1/LibPQ/LeQQU.ji.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] compilecache(::Base.PkgId, ::String) at ./loading.jl:1197
 [3] _require(::Base.PkgId) at ./loading.jl:960
 [4] require(::Base.PkgId) at ./loading.jl:858
 [5] require(::Module, ::Symbol) at ./loading.jl:853

It seems to be a TimeZones-related problem, but what do I know!

rstrip pgvalue throwing error for fetch

I have a basic Julia setup. I can successfully fetch data from table1, but when executing the command below for table2, I get the following error:

data = fetch!(DataFrame, execute(conn, "SELECT * FROM table2 limit 5"))

ERROR: MethodError: no method matching rstrip(::LibPQ.PQValue{0x00000412}, ::Char)

Is this something I'm doing wrong, or an issue with LibPQ?

Execution does not terminate when fetching LibPQ.result to a DataFrame

Following the example provided in the documentation, I was able to successfully run conn = LibPQ.Connection(connection_string) and result = execute(conn, query_string), however, when I try to fetch the result into a DataFrame via data = Data.stream!(result, DataFrame), the execution hangs indefinitely and I am forced to manually interrupt the program. I have the same issue regardless of how small/large the query result is.

a less painful solution for doing a bulk insert

Currently it is recommended that inserts are done with Data.stream!. The problem with this is that right now this performs a row-by-row insert which is unbelievably slow (for me it was at least 10 minutes for 5e4 rows with a perfectly good connection). It seems that libpq itself does not provide us with many good solutions.

One suggestion, thanks to lbilli on Discourse, is to write a DataFrame to a buffer in the form of a CSV and upload it with a COPY statement. While horrific, this might be the best option, as it might otherwise be completely impractical to do a large insert.

Now, I realize that the posted solution involves both DataFrames and CSV and that these will not be added as dependencies. My intention in opening this issue was to discuss:

  1. Should some function be added to LibPQ to make implementing this easier? Perhaps something like bulkcopy(::IO) and bulkcopy(::String)? (As far as I can tell there's nothing like that here now.)
  2. Is there a better solution lurking around that I'm just not seeing? If so I'd be glad to make a PR to document it.

Thanks all.
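
To make point 1 concrete, here is a sketch of the CSV-and-COPY approach. It assumes the LibPQ.CopyIn helper found in newer LibPQ releases, an open conn, and a hypothetical two-column table my_table(a, b); it is not a final API proposal:

using LibPQ

# Render each row as one CSV line (done by hand here to avoid a CSV.jl dependency).
row_strings = ["$(a),$(b)\n" for (a, b) in zip(1:50_000, rand(50_000))]

copyin = LibPQ.CopyIn("COPY my_table FROM STDIN (FORMAT CSV);", row_strings)
execute(conn, copyin)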

Document TimeZone

The TimeZone connection parameter is pretty important for working with timestamptz and friends, but it's undocumented.

COPY sometimes fails completely silently

I am following the example to do COPY for inserts. Sometimes this fails completely silently. I'm working on an MWE, but of course this is extremely difficult, because SQL.

Is there any way to get debug messages?

Check status on every operation

Allowing functions to use Connections which may be broken or closed is dangerous.

For now I'll just add a function call to every function that calls libpq_c functions on Ptr{PGconn} directly, but it might be a good idea to have a function-wrapping macro that handles that in the future.
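
A sketch of the guard (a hypothetical helper; the real version would wrap each libpq_c call made on the Ptr{PGconn}):

function with_ok_connection(f, conn)
    # LibPQ.status wraps PQstatus; anything other than CONNECTION_OK means the
    # connection is broken or closed and should not be handed to libpq_c.
    if LibPQ.status(conn) != LibPQ.libpq_c.CONNECTION_OK
        error("LibPQ connection is broken or closed")
    end
    return f(conn)
end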

UndefVarError: Inference not defined on Julia 0.7-alpha

When I try "fetch" I get this error:

WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.iteratorsize is deprecated, use IteratorSize instead.
 in module IterTools
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
in Type at /home/js/.julia/packages/LayerDicts/nxOp/src/LayerDicts.jl
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
WARNING: Base.Associative is deprecated, use AbstractDict instead.
 in module LayerDicts
in Type at /home/js/.julia/packages/LayerDicts/nxOp/src/LayerDicts.jl
WARNING: importing deprecated binding Base.clear! into LibPQ.
WARNING: Base.clear! is deprecated: it has been moved to the standard library package `Distributed`.
Add `using Distributed` to your imports.
 in module LibPQ
WARNING: Base.clear! is deprecated: it has been moved to the standard library package `Distributed`.
Add `using Distributed` to your imports.
 in module LibPQ
WARNING: Base.clear! is deprecated: it has been moved to the standard library package `Distributed`.
Add `using Distributed` to your imports.
 in module LibPQ
WARNING: Base.clear! is deprecated: it has been moved to the standard library package `Distributed`.
Add `using Distributed` to your imports.
 in module LibPQ
┌ Warning: `datatype_module(t::DataType)` is deprecated, use `parentmodule(t)` instead.
│   caller = datatype at DataStreams.jl:464 [inlined]
└ @ Core DataStreams.jl:464
┌ Warning: `datatype_name(t::DataType)` is deprecated, use `nameof(t)` instead.
│   caller = datatype at DataStreams.jl:464 [inlined]
└ @ Core DataStreams.jl:464
┌ Warning: `eval(m, x)` is deprecated, use `Core.eval(m, x)` instead.
│   caller = datatype at DataStreams.jl:464 [inlined]
└ @ Core DataStreams.jl:464
┌ Warning: `datatype_module(t::DataType)` is deprecated, use `parentmodule(t)` instead.
│   caller = datatype at DataStreams.jl:464 [inlined]
└ @ Core DataStreams.jl:464
┌ Warning: `datatype_name(t::DataType)` is deprecated, use `nameof(t)` instead.
│   caller = datatype at DataStreams.jl:464 [inlined]
└ @ Core DataStreams.jl:464
┌ Warning: `eval(m, x)` is deprecated, use `Core.eval(m, x)` instead.
│   caller = datatype at DataStreams.jl:464 [inlined]
└ @ Core DataStreams.jl:464
ERROR: UndefVarError: Inference not defined
Stacktrace:
 [1] getproperty at ./sysimg.jl:13 [inlined]
 [2] (::getfield(DataStreams.Data, Symbol("##8#11")){Tuple{Union,Union},Tuple{typeof(identity),typeof(identity)}})(::Int64) at ./<missing>:0
 [3] iterate at ./generator.jl:46 [inlined]
 [4] append_any(::Base.Generator{UnitRange{Int64},getfield(DataStreams.Data, Symbol("##8#11")){Tuple{Union,Union},Tuple{typeof(identity),typeof(identity)}}}) at ./essentials.jl:399
 [5] transform(::DataStreams.Data.Schema{true,Tuple{Union{Missing, String},Union{Missing, Int32}}}, ::Dict{Int64,Function}, ::Any) at /home/js/.julia/packages/DataStreams/4kJ4/src/DataStreams.jl:94
 [6] #stream!#17(::Bool, ::Dict{Int64,Function}, ::Function, ::Array{Any,1}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Any, ::LibPQ.Result, ::Type{DataFrame}) at /home/js/.julia/packages/DataStreams/4kJ4/src/DataStreams.jl:481
 [7] stream!(::LibPQ.Result, ::Type{DataFrame}) at /home/js/.julia/packages/DataStreams/4kJ4/src/DataStreams.jl:475
 [8] #fetch!#45 at /home/js/.julia/packages/LibPQ/N7lD/src/datastreams.jl:138 [inlined]
 [9] fetch!(::Type, ::LibPQ.Result) at /home/js/.julia/packages/LibPQ/N7lD/src/datastreams.jl:134
 [10] top-level scope
 versioninfo()
WARNING: importing deprecated binding Base.uninitialized into OffsetArrays.
WARNING: Base.uninitialized is deprecated, use undef instead.
 in module OffsetArrays
WARNING: Base.uninitialized is deprecated, use undef instead.
 in module OffsetArrays
WARNING: Base.uninitialized is deprecated, use undef instead.
 in module OffsetArrays
Julia Version 0.7.0-alpha.31
Commit 387f492184* (2018-06-04 13:25 UTC)
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: Intel(R) Core(TM) i7-5600U CPU @ 2.60GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-6.0.0 (ORCJIT, broadwell)

No problem on 0.6.3

ArgumentError: invalid BigInt: "NaN"

When running

result = execute(conn, "SELECT * FROM table limit 100;")
data = Data.stream!(result, NamedTuple)

it shows

ArgumentError: invalid BigInt: "NaN"

Stacktrace:
 [1] tryparse_internal(::Type{BigInt}, ::String, ::Int64, ::Int64, ::Int64, ::Bool) at ./gmp.jl:260
 [2] #parse#348(::Nothing, ::Function, ::Type{BigInt}, ::String) at ./parse.jl:238
 [3] parse at ./parse.jl:238 [inlined]
 [4] parameters at /home/gsc/.julia/packages/Decimals/Qfcas/src/decimal.jl:19 [inlined]
 [5] parse(::Type{Decimals.Decimal}, ::String) at /home/gsc/.julia/packages/Decimals/Qfcas/src/decimal.jl:6
 [6] parse(::Type{Decimals.Decimal}, ::LibPQ.PQValue{0x000006a4}) at /home/gsc/.julia/packages/LibPQ/LODsS/src/parsing.jl:110
 [7] (::getfield(LibPQ, Symbol("#parse_type#41")){DataType})(::LibPQ.PQValue{0x000006a4}) at /home/gsc/.julia/packages/LibPQ/LODsS/src/parsing.jl:337
 [8] streamfrom(::LibPQ.Result, ::Type{DataStreams.Data.Field}, ::Type{Union{Missing, Decimal}}, ::Int64, ::Int64) at /home/gsc/.julia/packages/LibPQ/LODsS/src/datastreams.jl:60
 [9] macro expansion at /home/gsc/.julia/packages/DataStreams/saxcP/src/query.jl:474 [inlined]
 [10] stream!(::LibPQ.Result, ::DataStreams.Data.Query{0x01,Tuple{DataStreams.Data.QueryColumn{0x01,Union{Missing, ZonedDateTime},1,1,:time,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, String},2,2,:symbol,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},3,3,:macd,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},4,4,:diff,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},5,5,:dea,nothing,()}},(),nothing,nothing}, ::Type{DataStreams.Data.Field}, ::NamedTuple{(:time, :symbol, :macd, :diff, :dea),Tuple{Array{Union{Missing, ZonedDateTime},1},Array{Union{Missing, String},1},Array{Union{Missing, Decimal},1},Array{Union{Missing, Decimal},1},Array{Union{Missing, Decimal},1}}}, ::DataStreams.Data.Schema{true,Tuple{Union{Missing, ZonedDateTime},Union{Missing, String},Union{Missing, Decimal},Union{Missing, Decimal},Union{Missing, Decimal}}}, ::Int64) at /home/gsc/.julia/packages/DataStreams/saxcP/src/query.jl:618
 [11] #stream!#113(::Bool, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::LibPQ.Result, ::DataStreams.Data.Query{0x01,Tuple{DataStreams.Data.QueryColumn{0x01,Union{Missing, ZonedDateTime},1,1,:time,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, String},2,2,:symbol,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},3,3,:macd,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},4,4,:diff,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},5,5,:dea,nothing,()}},(),nothing,nothing}, ::Type{NamedTuple}) at /home/gsc/.julia/packages/DataStreams/saxcP/src/query.jl:588
 [12] (::getfield(DataStreams.Data, Symbol("#kw##stream!")))(::NamedTuple{(:append,),Tuple{Bool}}, ::typeof(DataStreams.Data.stream!), ::LibPQ.Result, ::DataStreams.Data.Query{0x01,Tuple{DataStreams.Data.QueryColumn{0x01,Union{Missing, ZonedDateTime},1,1,:time,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, String},2,2,:symbol,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},3,3,:macd,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},4,4,:diff,nothing,()},DataStreams.Data.QueryColumn{0x01,Union{Missing, Decimal},5,5,:dea,nothing,()}},(),nothing,nothing}, ::Type{NamedTuple}) at ./none:0
 [13] #stream!#111(::Bool, ::Dict{Int64,Function}, ::Function, ::Array{Any,1}, ::Array{Any,1}, ::Nothing, ::Nothing, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(DataStreams.Data.stream!), ::LibPQ.Result, ::Type{NamedTuple}) at /home/gsc/.julia/packages/DataStreams/saxcP/src/query.jl:538
 [14] stream!(::LibPQ.Result, ::Type{NamedTuple}) at /home/gsc/.julia/packages/DataStreams/saxcP/src/query.jl:516
 [15] top-level scope at In[26]:1

There is a NaN in the table; I don't know why insertion works while selection doesn't.

deps.jl missing

I managed to add the latest master, but after using LibPQ I get:

LoadError: LoadError: could not open file C:\Users\pn\.julia\v0.6\LibPQ\src\..\deps\deps.jl
while loading C:\Users\pn\.julia\v0.6\LibPQ\src\LibPQ.jl, in expression starting on line 43

The file seems to be missing.

Support additional client encodings

This is a bunch of work. This requires:

  • a copy of the encoding stored in the Result object (the connection can change encoding but this doesn't affect existing results)
  • decode all strings fetched from the DB and encode all strings sent to the DB
  • figure out what data, aside from query parameters, follow client_encoding
  • maybe include an encoded string type so we don't have to convert things (this is probably hard)

Introduce Exception types

All the error calls could be different exception types. In addition, exception types representing PostgreSQL errors should contain a reference to the connection, to enable follow-up actions (including printing more verbose errors).
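
A sketch of the proposed shape (hypothetical names, not an API proposal):

# Carries the connection so callers can run follow-up queries or fetch more
# verbose diagnostics while handling the error.
struct PostgreSQLConnectionError <: Exception
    msg::String
    conn::LibPQ.Connection
end

Base.showerror(io::IO, err::PostgreSQLConnectionError) =
    print(io, "PostgreSQLConnectionError: ", err.msg)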
