
db-to-sqlite's Introduction

db-to-sqlite


CLI tool for exporting tables or queries from any SQL database to a SQLite file.

Installation

Install from PyPI like so:

pip install db-to-sqlite

If you want to use it with MySQL, you can install the extra dependency like this:

pip install 'db-to-sqlite[mysql]'

Installing the mysqlclient library on OS X can be tricky - I've found this recipe to work (run that before installing db-to-sqlite).

For PostgreSQL, use this:

pip install 'db-to-sqlite[postgresql]'

Usage

Usage: db-to-sqlite [OPTIONS] CONNECTION PATH

  Load data from any database into SQLite.

  PATH is a path to the SQLite file to create, e.g. /tmp/my_database.db

  CONNECTION is a SQLAlchemy connection string, for example:

      postgresql://localhost/my_database
      postgresql://username:passwd@localhost/my_database

      mysql://root@localhost/my_database
      mysql://username:passwd@localhost/my_database

  More: https://docs.sqlalchemy.org/en/13/core/engines.html#database-urls

Options:
  --version                     Show the version and exit.
  --all                         Detect and copy all tables
  --table TEXT                  Specific tables to copy
  --skip TEXT                   When using --all skip these tables
  --redact TEXT...              (table, column) pairs to redact with ***
  --sql TEXT                    Optional SQL query to run
  --output TEXT                 Table in which to save --sql query results
  --pk TEXT                     Optional column to use as a primary key
  --index-fks / --no-index-fks  Should foreign keys have indexes? Default on
  -p, --progress                Show progress bar
  --postgres-schema TEXT        PostgreSQL schema to use
  --help                        Show this message and exit.

For example, to save the content of the blog_entry table from a PostgreSQL database to a local file called blog.db you could do this:

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    --table=blog_entry

You can specify --table more than once.
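For example, to copy two tables in one run (blog_tag here is just an illustrative second table name):

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    --table=blog_entry \
    --table=blog_tag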

You can also save the data from all of your tables, effectively creating a SQLite copy of your entire database. Any foreign key relationships will be detected and added to the SQLite database. For example:

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    --all

When running --all you can specify tables to skip using --skip:

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    --all \
    --skip=django_migrations

If you want to save the results of a custom SQL query, do this:

db-to-sqlite "postgresql://localhost/myblog" output.db \
    --output=query_results \
    --sql="select id, title, created from blog_entry" \
    --pk=id

The --output option specifies the table that should contain the results of the query.
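There is also a --redact option, which takes a table name and a column name and replaces that column's values with *** in the copy; it can be used multiple times. A sketch, assuming a Django-style auth_user table:

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    --all \
    --redact auth_user password --redact auth_user email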

Using db-to-sqlite with PostgreSQL schemas

If the tables you want to copy from your PostgreSQL database aren't in the default schema, you can specify an alternate one with the --postgres-schema option:

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    --all \
    --postgres-schema my_schema

Using db-to-sqlite with MS SQL

The best way to get the connection string needed for the MS SQL connections below is to use urllib.parse.quote_plus() from the Python standard library, as below:

import urllib.parse

params = urllib.parse.quote_plus(
    "DRIVER={SQL Server Native Client 11.0};"
    "SERVER=localhost;"
    "DATABASE=my_database;"
    "Trusted_Connection=yes;"
)

The above will resolve to

DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+Trusted_Connection%3Dyes

You can then use that string as the odbc_connect parameter of the connection string, as in the examples below:

mssql+pyodbc:///?odbc_connect=DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+Trusted_Connection%3Dyes
mssql+pyodbc:///?odbc_connect=DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+UID%3Dusername%3B+PWD%3Dpasswd
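Putting it together, a sketch of a complete invocation using the trusted-connection string above:

db-to-sqlite "mssql+pyodbc:///?odbc_connect=DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+Trusted_Connection%3Dyes" my_database.db --all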

Using db-to-sqlite with Heroku Postgres

If you run an application on Heroku using their Postgres database product, you can use the heroku config command to access a compatible connection string:

$ heroku config --app myappname | grep HEROKU_POSTG
HEROKU_POSTGRESQL_OLIVE_URL: postgres://username:password@hostname:5432/dbname

You can pass this to db-to-sqlite to create a local SQLite database with the data from your Heroku instance.

You can even do this using a bash one-liner:

$ db-to-sqlite $(heroku config --app myappname | grep HEROKU_POSTG | cut -d: -f 2-) \
    /tmp/heroku.db --all -p
1/23: django_migrations
...
17/23: blog_blogmark
[####################################]  100%
...

Related projects

  • Datasette: A tool for exploring and publishing data. Works great with SQLite files generated using db-to-sqlite.
  • sqlite-utils: Python CLI utility and library for manipulating SQLite databases.
  • csvs-to-sqlite: Convert CSV files into a SQLite database.

Development

To set up this tool locally, first check out the code. Then create a new virtual environment:

cd db-to-sqlite
python3 -m venv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest

This will skip tests against MySQL or PostgreSQL if you do not have their additional dependencies installed.

You can install those extra dependencies like so:

pip install -e '.[test_mysql,test_postgresql]'

You can alternatively use pip install psycopg2-binary if you cannot install the psycopg2 dependency used by the test_postgresql extra.

See Running a MySQL server using Homebrew for tips on running the tests against MySQL on macOS, including how to install the mysqlclient dependency.

The PostgreSQL and MySQL tests default to expecting to run against servers on localhost. You can use environment variables to point them at different test database servers:

  • MYSQL_TEST_DB_CONNECTION - defaults to mysql://root@localhost/test_db_to_sqlite
  • POSTGRESQL_TEST_DB_CONNECTION - defaults to postgresql://localhost/test_db_to_sqlite

The database you indicate in the environment variable - test_db_to_sqlite by default - will be deleted and recreated on every test run.
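For example, to point a single test run at remote servers (the hostnames here are placeholders):

MYSQL_TEST_DB_CONNECTION=mysql://root@mysql.example.com/test_db_to_sqlite \
POSTGRESQL_TEST_DB_CONNECTION=postgresql://pg.example.com/test_db_to_sqlite \
pytest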


db-to-sqlite's Issues

--redact tablename colname option

It might be useful if the CLI could redact specific columns.

db-to-sqlite postgres://... data.db --all \
    --redact auth_user password --redact auth_user email

Refactor unit tests to be parametrized against MySQL and PostgreSQL

Something like this:

DATABASES = (
    ("MySQL", mysqldb),
    ("PostgreSQL", psycopg2),
)

@pytest.mark.parametrize("name,mod", DATABASES)
def test_redact(name, mod):
    if mod is None:
        pytest.skip("Skipped {}".format(name))

I could define my own decorator which does the parameterized thing AND the fixture registration:

@test_databases
def test_redact(filepath, connection):
    # ...

StopIteration error thrown on empty tables

(db-to-sqlite) bash-3.2$ python -i $(which db-to-sqlite) postgresql://localhost:5433/db --all db.db
Traceback (most recent call last):
  File "/Users/simonw/.local/share/virtualenvs/db-to-sqlite-CRf3fZQ-/bin/db-to-sqlite", line 11, in <module>
    load_entry_point('db-to-sqlite', 'console_scripts', 'db-to-sqlite')()
  File "/Users/simonw/.local/share/virtualenvs/db-to-sqlite-CRf3fZQ-/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/simonw/.local/share/virtualenvs/db-to-sqlite-CRf3fZQ-/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/simonw/.local/share/virtualenvs/db-to-sqlite-CRf3fZQ-/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/simonw/.local/share/virtualenvs/db-to-sqlite-CRf3fZQ-/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/simonw/Desktop/Development/db-to-sqlite/db_to_sqlite/cli.py", line 103, in cli
    db[table].upsert_all(rows, pk=pk)
  File "/Users/simonw/.local/share/virtualenvs/db-to-sqlite-CRf3fZQ-/lib/python3.6/site-packages/sqlite_utils/db.py", line 1018, in upsert_all
    extracts=extracts,
  File "/Users/simonw/.local/share/virtualenvs/db-to-sqlite-CRf3fZQ-/lib/python3.6/site-packages/sqlite_utils/db.py", line 885, in insert_all
    first_record = next(records)
StopIteration

This is actually due to sqlite_utils being unable to handle empty lists passed to upsert_all.

MySQL tests fail after multiple runs

I noticed while running the tests locally that the MySQL tests started failing with an error (complete summary at the bottom):

MySQLdb.IntegrityError: (1062, "Duplicate entry '1' for key 'vendors.PRIMARY'")

I compared the setup blocks for MySQL and PostgreSQL and I see one difference is that the PostgreSQL block drops databases during setup while the MySQL block does not. @simonw do you test locally and have a setup that works for you? I'll submit a PR linked to this issue for how I worked around it.
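In the meantime, a minimal manual workaround is to drop the stale test database before each run (a sketch, assuming a passwordless local root account):

mysql -u root -e "DROP DATABASE IF EXISTS test_db_to_sqlite"
pytest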

Btw: I followed the README closely. I see that the tests on CI pass, which I assume is because the databases are wiped on each run.

Environment:

  • OS: macos (arm64)
  • Python: 3.11
  • db-to-sqlite version: v1.5 (b7c95fc)
pytest output summary (first lines truncated):
ERROR tests/test_docs.py::test_readme_contains_latest_help - MySQLdb.IntegrityError: (1062, "Duplicate entry '1' for key 'vendors.PRIMARY'")
ERROR tests/test_fixtures.py::test_fixture_mysql - MySQLdb.IntegrityError: (1062, "Duplicate entry '1' for key 'vendors.PRIMARY'")
ERROR tests/test_fixtures.py::test_fixture_postgresql - MySQLdb.IntegrityError: (1062, "Duplicate entry '1' for key 'vendors.PRIMARY'")
ERROR tests/test_redact.py::test_redact[mysql://root@localhost:3306/test_db_to_sqlite] - MySQLdb.IntegrityError: (1062, "Duplicate entry '1' for key 'vendors.PRIMARY'")
ERROR tests/test_redact.py::test_redact[postgresql://localhost/test_db_to_sqlite] - MySQLdb.IntegrityError: (1062, "Duplicate entry '1' for key 'vendors.PRIMARY'")
ERROR tests/test_redact.py::test_redact[postgres://localhost/test_db_to_sqlite] - MySQLdb.IntegrityError: (1062, "Duplicate entry '1' for key 'vendors.PRIMARY'")
=========================================================== 19 errors in 0.13s ============================================================

Add option for selecting tables to copy by a (regex) pattern

Besides just --all to select all tables, --table to select specific tables, or --sql to run a custom SQL query, it would be nice to be able to select the tables to copy by a regex pattern on table names. It could look something like this:

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    --table-name-pattern "tag_\w+"
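Until such an option exists, a hedged shell workaround is to generate the --table flags from a query against the system catalog (this assumes psql access to the same database):

db-to-sqlite "postgresql://localhost/myblog" blog.db \
    $(psql myblog -At -c "select tablename from pg_tables where tablename ~ 'tag_\w+'" \
        | sed 's/^/--table=/')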

postgresql - search_path

I ran this:

db-to-sqlite "postgresql://user:[email protected]/dbname" sqlite.db --progress --table=artist

I kept getting this error.

Traceback (most recent call last):
  File "/home/andres/.local/bin/db-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/home/andres/.local/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/andres/.local/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/andres/.local/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/andres/.local/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/andres/.local/lib/python3.8/site-packages/db_to_sqlite/cli.py", line 75, in cli
    pks = inspector.get_pk_constraint(table)["constrained_columns"]
  File "/home/andres/.local/lib/python3.8/site-packages/sqlalchemy/engine/reflection.py", line 528, in get_pk_constraint
    return self.dialect.get_pk_constraint(
  File "<string>", line 2, in get_pk_constraint
  File "/home/andres/.local/lib/python3.8/site-packages/sqlalchemy/engine/reflection.py", line 55, in cache
    ret = fn(self, con, *args, **kw)
  File "/home/andres/.local/lib/python3.8/site-packages/sqlalchemy/dialects/postgresql/base.py", line 3863, in get_pk_constraint
    table_oid = self.get_table_oid(
  File "<string>", line 2, in get_table_oid
  File "/home/andres/.local/lib/python3.8/site-packages/sqlalchemy/engine/reflection.py", line 55, in cache
    ret = fn(self, con, *args, **kw)
  File "/home/andres/.local/lib/python3.8/site-packages/sqlalchemy/dialects/postgresql/base.py", line 3465, in get_table_oid
    raise exc.NoSuchTableError(table_name)
sqlalchemy.exc.NoSuchTableError: artist

It also happened if I added the full name of the table:

db-to-sqlite "postgresql://user:[email protected]/dbname" sqlite.db --progress --table=my_postgres_schema.artist

Then I added this after creating the db_conn variable, and it worked:

db_conn.execute('SET search_path TO <my_postgres_schema>')

Is there a way to get this to work without modifying the program, or am I doing something wrong?
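(For reference: the --postgres-schema option documented earlier in this README was added for exactly this situation; a sketch of the equivalent invocation:)

db-to-sqlite "postgresql://user:password@host/dbname" sqlite.db \
    --progress --table=artist --postgres-schema my_postgres_schema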

incompatibility with sqlite-utils v2+

Hi @simonw and thanks for such great tools! I'm using datasette on http://data.greenferries.org/ , it's been a delight.

I think there is an incompatibility with the breaking changes you made on upsert_all in version 2+ of sqlite-utils.

I get this error when trying to convert a PostgreSQL db:

$ db-to-sqlite --all "postgresql://localhost/greenferries_prod_tmp" greenferries.db
Traceback (most recent call last):
  File "/Users/adipasquale/.venvs/greenferries-data/bin/db-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/Users/adipasquale/.venvs/greenferries-data/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/adipasquale/.venvs/greenferries-data/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/adipasquale/.venvs/greenferries-data/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/adipasquale/.venvs/greenferries-data/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/adipasquale/.venvs/greenferries-data/lib/python3.7/site-packages/db_to_sqlite/cli.py", line 111, in cli
    db[table].upsert_all(rows, pk=pk)
  File "/Users/adipasquale/.venvs/greenferries-data/lib/python3.7/site-packages/sqlite_utils/db.py", line 1159, in upsert_all
    upsert=True,
  File "/Users/adipasquale/.venvs/greenferries-data/lib/python3.7/site-packages/sqlite_utils/db.py", line 1086, in insert_all
    result = self.db.conn.execute(query, params)
sqlite3.OperationalError: near "WHERE": syntax error

I've managed to mitigate this problem by adding sqlite-utils==1.12.1 to my requirements.txt before db-to-sqlite[postgresql], and now it works as expected.

I tried to understand the actual problem with v2 but I'm not familiar enough with your codebase sorry :/ let me know if I can help somehow.

After digging, it seems to be linked to a specific table in my PostgreSQL db, namely the Rails schema_migrations one, which contains a single varchar column with 21 entries in my case (surprising). I've bisected it to db-to-sqlite --table schema_migrations "postgresql://localhost/greenferries_prod_tmp" greenferries.db triggering the error; I can skip that table and it fixes the problem (with sqlite-utils v2+, that is).

I've uploaded a small SQL dump on gist so you can reproduce:

curl https://gist.githubusercontent.com/adipasquale/aae8ece24005f1a6e39b2b3bd529ce4f/raw/af456652df0618bd2b53b3676a9154644924a153/tmp.sql > tmp.sql
createdb brokendb
psql brokendb < tmp.sql
db-to-sqlite "postgresql://localhost/brokendb" greenferries.db

(btw I don't actually need this table in datasette, so I've only created this issue for reference).

sqlite3.OperationalError: table sqlite_master may not be modified

I get the following error when I try to use the db-to-sqlite command to export my postgres db into a .db file:
sqlite3.OperationalError: table sqlite_master may not be modified

I'm using db-to-sqlite 1.4, sqlite-utils 3.30, and Python 3.10.8. I installed db-to-sqlite with pip install 'db-to-sqlite[postgresql]'.

My command is something like:
db-to-sqlite "postgresql://localhost/my_db" my_exported_db.db --all --skip=django_migrations

Traceback:

File "/Users/user/.virtualenvs/myproject/bin/db-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/Users/user/.virtualenvs/myproject/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/Users/user/.virtualenvs/myproject/lib/python3.10/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/Users/user/.virtualenvs/myproject/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/user/.virtualenvs/myproject/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/Users/user/.virtualenvs/myproject/lib/python3.10/site-packages/db_to_sqlite/cli.py", line 166, in cli
    db.add_foreign_keys(foreign_keys_to_add_final)
  File "/Users/user/.virtualenvs/myproject/lib/python3.10/site-packages/sqlite_utils/db.py", line 1096, in add_foreign_keys
    cursor.execute(
sqlite3.OperationalError: table sqlite_master may not be modified

The only workaround I found online was this comment by awpears, but that means overwriting the db.py file.

pip install blows up when it can't find mysql?

When trying to install this command, the "pip install" process does not check to see if there is a mysql command that has been previously installed, and bombs out with a nasty error:

$ pip3 install db-to-sqlite[mysql]
Collecting db-to-sqlite[mysql]
  Downloading https://files.pythonhosted.org/packages/36/3a/d758ef2009d6f33499c2ffcf7044d3fa52b03a1c96a7e53320aa0db90951/db_to_sqlite-1.0.1-py3-none-any.whl
Collecting sqlalchemy (from db-to-sqlite[mysql])
  Downloading https://files.pythonhosted.org/packages/62/3c/9dda60fd99dbdcbc6312c799a3ec9a261f95bc12f2874a35818f04db2dd9/SQLAlchemy-1.3.5.tar.gz (5.9MB)
    100% |████████████████████████████████| 5.9MB 5.2MB/s 
Collecting sqlite-utils>=1.4 (from db-to-sqlite[mysql])
  Downloading https://files.pythonhosted.org/packages/4f/f9/783c91652ef8ed4b0378f570c34b1cc6da03a34e59565127396153c6882d/sqlite_utils-1.4-py3-none-any.whl
Collecting click (from db-to-sqlite[mysql])
  Downloading https://files.pythonhosted.org/packages/fa/37/45185cb5abbc30d7257104c434fe0b07e5a195a6847506c074527aa599ec/Click-7.0-py2.py3-none-any.whl (81kB)
    100% |████████████████████████████████| 81kB 10.7MB/s 
Collecting mysqlclient; extra == "mysql" (from db-to-sqlite[mysql])
  Downloading https://files.pythonhosted.org/packages/f4/f1/3bb6f64ca7a429729413e6556b7ba5976df06019a5245a43d36032f1061e/mysqlclient-1.4.2.post1.tar.gz (85kB)
    100% |████████████████████████████████| 92kB 15.9MB/s 
    Complete output from command python setup.py egg_info:
    /bin/sh: mysql_config: command not found
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/private/var/folders/7_/4p6y50v5595f9bhdpmhk3wbmt608x8/T/pip-install-39mqg7so/mysqlclient/setup.py", line 16, in <module>
        metadata, options = get_config()
      File "/private/var/folders/7_/4p6y50v5595f9bhdpmhk3wbmt608x8/T/pip-install-39mqg7so/mysqlclient/setup_posix.py", line 51, in get_config
        libs = mysql_config("libs")
      File "/private/var/folders/7_/4p6y50v5595f9bhdpmhk3wbmt608x8/T/pip-install-39mqg7so/mysqlclient/setup_posix.py", line 29, in mysql_config
        raise EnvironmentError("%s not found" % (_mysql_config_path,))
    OSError: mysql_config not found
    
    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /private/var/folders/7_/4p6y50v5595f9bhdpmhk3wbmt608x8/T/pip-install-39mqg7so/mysqlclient/

[1]+  Stopped                 pip3 install db-to-sqlite[mysql]

It would be great if the pip install process could be made a bit more intelligent, so that it refuses to install anything if it can't find a mysql command to work with.

Upgrade for compatibility with SQLAlchemy 2

Hi.

I'm trying to use the tool to export a Dockerized PostgreSQL database to a SQLite DB in order to read it with Datasette.

But when I try to run the command I get the following error:

python3 -m venv venv
source venv/bin/activate
pip install db-to-sqlite
pip install 'db-to-sqlite[postgresql]'
And then:

❯ db-to-sqlite "postgresql://user:password@localhost:8080/my_db" demo.db \
    --all \
    --postgres-schema public
Traceback (most recent call last):
  File "/root/datasette/venv/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1410, in execute
    meth = statement._execute_on_connection
AttributeError: 'str' object has no attribute '_execute_on_connection'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/datasette/venv/bin/db-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/root/datasette/venv/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/root/datasette/venv/lib/python3.10/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/root/datasette/venv/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/root/datasette/venv/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/root/datasette/venv/lib/python3.10/site-packages/db_to_sqlite/cli.py", line 112, in cli
    results = db_conn.execute("select * from {}".format(table_quoted))
  File "/root/datasette/venv/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1412, in execute
    raise exc.ObjectNotExecutableError(statement) from err
sqlalchemy.exc.ObjectNotExecutableError: Not an executable object: 'select * from "some_table"'

I'm using PG 13.9-1.pgdg110+1 in Docker Desktop 4.16.3 (96739).
Python 3.10.6
Ubuntu 22.04.1 LTS with WSL2

ValueError

I'm trying to back up a postgres db to sqlite, but running in to this issue:

ValueError: year 275760 is out of range

This database has an expires_at column which is set to 'never', which in this case just means it is set so far into the future that Python cannot deal with it:

COPY omnivore.api_key (id, user_id, name, key, scopes, expires_at, created_at, used_at) FROM stdin;
9702de10-f681-11ee-a194-87a8719f2d3d	deee6128-f25c-11ee-991f-df4a5f9dc67c	Test	82eb3b1c190d7b1c3163aa2387ce67c8310c7e1b554259ac68825a43c00bc1cf	{}	275760-09-13 00:00:00+00	2024-04-09 14:58:13.943123+00	\N

Since I'm self-hosting this software I'm just going to patch the 'never' date with a date like 01-01-9999 that makes sense in Python as a workaround. More generally, as a user of db-to-sqlite, any idea how to deal with these kinds of errors? Is there any way to handle them?
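As a hedged workaround for a single problem column, the --sql/--output route from the README can clamp the timestamp in the query itself (the database name is assumed; the columns are taken from the COPY statement above, trimmed for brevity):

db-to-sqlite "postgresql://localhost/omnivore" omnivore.db \
    --output=api_key \
    --sql="select id, user_id, name, least(expires_at, timestamp '9999-01-01') as expires_at from omnivore.api_key" \
    --pk=id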

Support multiple --sql and --output pairs

Like this:

$ db-to-sqlite mysql://localhost/blah blah.db \
    --sql "select id, name from categories" --output="cats" \
    --sql "select id, name from authors" --output="authors"

Is it possible to skip tables if they do not exist?

I have a case where I run this CLI in batch against several databases, with a list of tables I want to import. At the moment the tables I am pulling exist in all of the databases, but now I need to pull some tables that exist in only some of them.

Is it possible to use a parameter or configuration in the CLI that allows me to "skip if it does not exist"?

Thank you.
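While no such flag exists, a hedged workaround is to build the --table flags from the tables that actually exist in each database; for PostgreSQL, something like this (the database and table names are placeholders):

db-to-sqlite "postgresql://localhost/db1" out.db \
    $(psql db1 -At -c "select tablename from pg_tables where tablename in ('orders', 'customers', 'invoices')" \
        | sed 's/^/--table=/')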

Oracle number columns become integer in sqlite

The database I'm working with has several columns set as NUMBER in Oracle. This utility creates those fields as integer in SQLite. It works if I remove the two NUMBER fields from my query; otherwise it runs up until it encounters a non-integer number and then errors out with "Error binding parameter 64 - probably unsupported type".
Additionally, I'm currently only including 6 fields in my select query, so I don't know why it has 64 or more parameters.

Multiple Postgres Schemas?

Is it possible to export multiple postgres schemas? Ideally I would like to provide a comma-separated list of schemas...

db-to-sqlite --all --progress --postgres-schema schema1,schema2,schema3 CONNECTION PATH

Or is there a workaround whereby I can repeat the export one schema at a time into the same SQLite database?
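A hedged, untested sketch of that one-schema-at-a-time workaround, writing into the same SQLite file on each pass (this assumes table names do not collide across schemas):

for schema in schema1 schema2 schema3; do
    db-to-sqlite "postgresql://localhost/mydb" combined.db \
        --all --postgres-schema "$schema"
done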

TZ flag for migrating datetime columns?

In my original DB, all datetime fields are in UTC.
It should be possible (using a flag, perhaps?) to convert the datetimes so they are stored without a timezone in SQLite.
Right now the resulting value causes a few issues.

Sample postgres table (partial):

id,created_at
2,2021-07-30 11:44:09.984439 +00:00
3,2021-07-30 11:44:29.217916 +00:00
4,2021-07-30 11:44:43.598702 +00:00
5,2021-08-03 20:53:48.482419 +00:00
6,2021-08-04 22:19:52.810907 +00:00
7,2021-08-05 17:25:29.646553 +00:00
8,2021-08-05 17:25:33.425523 +00:00
9,2021-08-05 17:26:22.169369 +00:00

Sample result table:

2,2021-07-30T11:44:09.984439+00:00
3,2021-07-30T11:44:29.217916+00:00
4,2021-07-30T11:44:43.598702+00:00
5,2021-08-03T20:53:48.482419+00:00
6,2021-08-04T22:19:52.810907+00:00
7,2021-08-05T17:25:29.646553+00:00
8,2021-08-05T17:25:33.425523+00:00
9,2021-08-05T17:26:22.169369+00:00

Expected format:

285,2021-11-29 12:40:20.021350
286,2021-11-29 12:40:20.023282
287,2021-11-29 12:40:20.024771
288,2021-11-29 12:40:20.026995
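In the absence of such a flag, a hedged post-processing sketch using sqlite-utils (one of the related projects listed above) to strip the T and the timezone offset in place; mydb.db, mytable and created_at are placeholders:

sqlite-utils convert mydb.db mytable created_at \
    'value.replace("T", " ").split("+")[0]'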

Can't load plugin: sqlalchemy.dialects:postgres

Getting this error running the Heroku example from the README:

    $ db-to-sqlite $(heroku config --app myappname | grep HEROKU_POSTG | cut -d: -f 2-) \

https://stackoverflow.com/a/66794960/6083 explains why:

SQLAlchemy 1.4 removed the deprecated postgres dialect name, the name postgresql must be used instead now. The dialect is the part before the :// in the URL. SQLAlchemy 1.3 and earlier showed a deprecation warning but still accepted it.

To fix this, rename postgres:// in the URL to postgresql://.
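Applied to the Heroku one-liner from the README above, that rename can be done inline with sed:

db-to-sqlite $(heroku config --app myappname | grep HEROKU_POSTG \
    | cut -d: -f 2- | sed 's/postgres:/postgresql:/') \
    /tmp/heroku.db --all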

Correctly duplicate empty tables

I'm just trying out this library with a blank django project. I've a project on heroku and I want to replicate the heroku database locally in sqlite, but it seems that db-to-sqlite is skipping some tables.

Command run -

db-to-sqlite postgres://uid:pwd@host/dbname db.sqlite3 --all -p

Output -

1/10: django_migrations
2/10: django_content_type
3/10: auth_permission
4/10: auth_group
5/10: auth_group_permissions
6/10: auth_user_groups
7/10: auth_user_user_permissions
8/10: django_admin_log
9/10: auth_user
10/10: django_session

Adding 1 foreign key
  auth_permission.content_type_id => django_content_type.id

When I tried to run Django locally, I faced an error that Django could not find django_admin_log, so I tried to see whether the tables had been copied correctly or not:

$ sqlite3 db.sqlite3
SQLite version 3.22.0 2018-01-22 18:45:57
Enter ".help" for usage hints.
sqlite> .tables
auth_permission      django_content_type  django_session
auth_user            django_migrations
sqlite>

MySQL bool columns end up as SQLite BLOB columns upon conversion

I'm trying to use db-to-sqlite to convert a set of MySQL DDL files into an SQLite database. Some of my tables use the BIT(1) type to encode boolean values but the resulting SQLite database ends up with BLOB columns where I think INTEGER would be more natural for SQLite.

This is new territory for me but things I've picked up while researching my issue are:

  • MySQL doesn't have a full-fledged boolean type, similar to SQLite
  • People either use TINYINT or BIT(1) to store bools in a space-efficient way
    • TINYINT(1) may have even been an alias for BIT(1) at some point (source) but is no longer

I think it would be nice if db-to-sqlite could turn BIT(1) fields into INTEGER rather than BLOB because the current behavior is harder to query than it needs to be.

I'm happy to submit a PR for this with tests covering this and any related behavior.
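As a hedged interim workaround, BIT(1) columns can be cast to integers in a custom query using the --sql/--output options (the database, table and column names here are placeholders):

db-to-sqlite "mysql://root@localhost/mydb" mydb.db \
    --output=flags \
    --sql="select id, cast(is_active as unsigned) as is_active from flags" \
    --pk=id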

TypeError: 'NoneType' object is not iterable

Trying to run db-to-sqlite --progress --all --skip alembic_version --redact users password_hash <redacted-postgres-url> dump.sqlite (using version 1.2.1) gives me this after/during foreign key handling.

The output (with tables aliased):

1/10: table_1
2/10: alembic_version
  ... skipping
3/10: table_2
4/10: table_3
5/10: table_4
6/10: table_5
7/10: table_6
8/10: table_7
9/10: table_8
10/10: table_9

Adding 9 foreign keys
  table_2.column_1 => table_1.id
  table_3.column_2 => table_2.id
  table_3.column_3 => table_4.id
  table_4.column_1 => table_1.id
  table_6.column_4 => table_4.id
  table_6.column_2 => table_2.id
  table_6.column_1 => table_1.id
  table_7.column_5 => table_6.id
  table_7.column_3 => table_4.id
Traceback (most recent call last):
  File "/virtualenv/bin/db-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/virtualenv/lib64/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/virtualenv/lib64/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/virtualenv/lib64/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/virtualenv/lib64/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/virtualenv/lib64/python3.7/site-packages/db_to_sqlite/cli.py", line 150, in cli
    db.index_foreign_keys()
  File "/virtualenv/lib64/python3.7/site-packages/sqlite_utils/db.py", line 423, in index_foreign_keys
    i.columns[0] for i in table.indexes if len(i.columns) == 1
  File "/virtualenv/lib64/python3.7/site-packages/sqlite_utils/db.py", line 585, in indexes
    for row in self.db.execute_returning_dicts(sql):
  File "/virtualenv/lib64/python3.7/site-packages/sqlite_utils/db.py", line 172, in execute_returning_dicts
    keys = [d[0] for d in cursor.description]
TypeError: 'NoneType' object is not iterable

Error binding parameter X - probably unsupported type

Hey !

First of all, thanks for this script, it saved me a lot of time! 👍

I got a problem when I try to use it with a MySQL database that I need to export every week, so I am trying to automate this.

PS C:\Users\user\Desktop\db-to-sqlite-master\db_to_sqlite> db-to-sqlite "mysql://root:@localhost:3306/gtrx" gtrx.db --output=query_results --sql="select * from animal_2 where IdANIMAL='US000135746776'" --pk=id
Traceback (most recent call last):
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\Scripts\db-to-sqlite.exe\__main__.py", line 9, in <module>
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\click\core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\click\core.py", line 717, in main
    rv = self.invoke(ctx)
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\click\core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\click\core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\db_to_sqlite\cli.py", line 148, in cli
    db[output].insert_all(rows, pk=pk)
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\sqlite_utils\db.py", line 1024, in insert_all
    result = self.db.conn.execute(sql, values)
sqlite3.InterfaceError: Error binding parameter 18 - probably unsupported type.

It works for some rows before this message because they contain NULL data, but once it reaches the row with ID 'US000135746776' this error is thrown.

This is the table structure:
CREATE TABLE IF NOT EXISTS `animal` ( `IdANIMAL` varchar(16) NOT NULL, `LAST_DATE` text, `IdCHEPTEL` int(11) DEFAULT NULL, `NOBOVIC` text, `NOBOVIL` text, `NOBOET` text, `M` text, `AGMMP` text, `AGMPP` text, `AGMPM` text, `AGMMM` text, `AGPMP` text, `AGPPP` text, `AGPPM` text, `AGPMM` text, `GPP` text, `GMP` text, `GMM` text, `GPM` text, `P` text, `INDEX_VVEL_ORI` char(1) DEFAULT NULL, `INDEX_EC_ORI` char(1) DEFAULT NULL, `INDEX_LT_ORI` char(1) DEFAULT NULL, `INDEX_FERG_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_IA_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_REPRO_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_INEL_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_PC_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_FERV_ORI` char(1) DEFAULT NULL, `INDEX_MACL_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_LP_ORI` char(1) DEFAULT NULL, `INDEX_ME_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_PI_ORI` char(1) DEFAULT NULL, `INDEX_FNAI_ORI` char(1) DEFAULT NULL, `INDEX_TB_ORI` char(1) DEFAULT NULL, `INDEX_MU_VAL3` smallint(6) DEFAULT NULL, `INDEX_FVEL_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_LAIT_ORI` char(1) DEFAULT NULL, `INDEX_MA_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_VNAI_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_CC_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_FVEL_ORI` char(1) DEFAULT NULL, `INDEX_INEL_VAL3` smallint(6) DEFAULT NULL, `INDEX_LAIT_VAL4` smallint(6) DEFAULT NULL, `INDEX_TP_ORI` char(1) DEFAULT NULL, `INDEX_MO_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_MO_REF` varchar(12) DEFAULT NULL, `INDEX_AS_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_TE_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_MACL_ORI` char(1) DEFAULT NULL, `INDEX_HS_ORI` char(1) DEFAULT NULL, `INDEX_VVEL_VALSM` smallint(6) DEFAULT NULL, `INDEX_TB_VAL31` decimal(3,1) DEFAULT NULL, `INDEX_FERV_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_INEL_ORI` char(1) DEFAULT NULL, `INDEX_IVIA1_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_IVIA1_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_STMA_ORI` char(1) DEFAULT NULL, `INDEX_HS_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_LGF_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_TE_ORI` char(1) DEFAULT NULL, `INDEX_MO_NF` int(11) DEFAULT NULL, `INDEX_IB_ORI` char(1) DEFAULT NULL, `INDEX_PS_ORI` char(1) DEFAULT NULL, `INDEX_FERG_ORI` char(1) DEFAULT NULL, `INDEX_LGF_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_LAIT_NF` mediumint(9) DEFAULT NULL, `INDEX_AJ_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_ISU_REF` varchar(12) DEFAULT NULL, `INDEX_MP_ORI` char(1) DEFAULT NULL, `INDEX_EQ_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_AH_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_LO_ORI` char(1) DEFAULT NULL, `INDEX_CELL_ORI` char(1) DEFAULT NULL, `INDEX_AS_ORI` char(1) DEFAULT NULL, `INDEX_LT_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_CC_ORI` char(1) DEFAULT NULL, `INDEX_REPRO_ORI` char(1) DEFAULT NULL, `INDEX_ME_ORI` char(1) DEFAULT NULL, `INDEX_IS_ORI` char(1) DEFAULT NULL, `INDEX_FERV_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_AA_ORI` char(1) DEFAULT NULL, `INDEX_VNAI_VALSM` smallint(6) DEFAULT NULL, `INDEX_MG_VAL3` smallint(6) DEFAULT NULL, `INDEX_AH_ORI` char(1) DEFAULT NULL, `INDEX_LO_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_EA_ORI` char(1) DEFAULT NULL, `INDEX_LP_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_FNAI_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_MACL_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_IS_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_ISU_VAL3` smallint(3) unsigned DEFAULT NULL, `INDEX_AA_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_IA_ORI` char(1) DEFAULT NULL, `INDEX_STMA_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_MA_ORI` char(1) DEFAULT 
NULL, `INDEX_TR_ORI` char(1) DEFAULT NULL, `INDEX_PS_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_CELL_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_MO_ORI` char(1) DEFAULT NULL, `INDEX_LAIT_REF` varchar(12) DEFAULT NULL, `INDEX_FVEL_VALSM` smallint(6) DEFAULT NULL, `INDEX_EC_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_VVEL_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_MR_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_EA_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_MR_ORI` char(1) DEFAULT NULL, `INDEX_PC_ORI` char(1) DEFAULT NULL, `INDEX_CELL_CD` tinyint(2) unsigned DEFAULT NULL, `INDEX_FERG_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_PJ_ORI` char(1) DEFAULT NULL, `INDEX_ISU_ORI` char(1) DEFAULT NULL, `INDEX_MP_VAL3` smallint(6) DEFAULT NULL, `INDEX_PJ_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_MO_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_TP_VAL31` decimal(3,1) DEFAULT NULL, `INDEX_IVIA1_ORI` char(1) DEFAULT NULL, `INDEX_MG_ORI` char(1) DEFAULT NULL, `INDEX_LGF_ORI` char(1) DEFAULT NULL, `INDEX_TR_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_PI_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_VNAI_ORI` char(1) DEFAULT NULL, `INDEX_IB_VAL31` decimal(2,1) DEFAULT NULL, `INDEX_AJ_ORI` char(1) DEFAULT NULL, `INDEX_MU_ORI` char(1) DEFAULT NULL, `INDEX_FNAI_VALSM` smallint(6) DEFAULT NULL, `ORIGINE` varchar(2) DEFAULT NULL, `REF` varchar(50) DEFAULT NULL, `EDITION` text, `AANAIS` varchar(10) DEFAULT NULL, `COPAMA` decimal(2,1) DEFAULT NULL, `COPAFN` decimal(2,1) DEFAULT NULL, `NBGECO` decimal(3,1) DEFAULT NULL, `IdPAYS` int(11) DEFAULT NULL, `COPAIP` text, `NUNATI` text, PRIMARY KEY (`IdANIMAL`), KEY `IdPAYS` (`IdPAYS`) ) ENGINE=MyISAM DEFAULT CHARSET=utf8;

And this is the line where the script is blocking:
INSERT INTO `animal_2` (`IdANIMAL`, `LAST_DATE`, `IdCHEPTEL`, `NOBOVIC`, `NOBOVIL`, `NOBOET`, `M`, `AGMMP`, `AGMPP`, `AGMPM`, `AGMMM`, `AGPMP`, `AGPPP`, `AGPPM`, `AGPMM`, `GPP`, `GMP`, `GMM`, `GPM`, `NOM_GPM`, `P`, `NOM_P`, `INDEX_VVEL_ORI`, `INDEX_EC_ORI`, `INDEX_LT_ORI`, `INDEX_FERG_CD`, `INDEX_IA_VAL31`, `INDEX_REPRO_VAL31`, `INDEX_INEL_CD`, `INDEX_PC_VAL31`, `INDEX_FERV_ORI`, `INDEX_MACL_CD`, `INDEX_LP_ORI`, `INDEX_ME_VAL31`, `INDEX_PI_ORI`, `INDEX_FNAI_ORI`, `INDEX_TB_ORI`, `INDEX_MU_VAL3`, `INDEX_FVEL_CD`, `INDEX_LAIT_ORI`, `INDEX_MA_VAL31`, `INDEX_VNAI_CD`, `INDEX_CC_VAL31`, `INDEX_FVEL_ORI`, `INDEX_INEL_VAL3`, `INDEX_LAIT_VAL4`, `INDEX_TP_ORI`, `INDEX_MO_CD`, `INDEX_MO_REF`, `INDEX_AS_VAL31`, `INDEX_TE_VAL31`, `INDEX_MACL_ORI`, `INDEX_HS_ORI`, `INDEX_VVEL_VALSM`, `INDEX_TB_VAL31`, `INDEX_FERV_CD`, `INDEX_INEL_ORI`, `INDEX_IVIA1_VAL31`, `INDEX_IVIA1_CD`, `INDEX_STMA_ORI`, `INDEX_HS_VAL31`, `INDEX_LGF_VAL31`, `INDEX_TE_ORI`, `INDEX_MO_NF`, `INDEX_IB_ORI`, `INDEX_PS_ORI`, `INDEX_FERG_ORI`, `INDEX_LGF_CD`, `INDEX_LAIT_NF`, `INDEX_AJ_VAL31`, `INDEX_ISU_REF`, `INDEX_MP_ORI`, `INDEX_EQ_VAL31`, `INDEX_AH_VAL31`, `INDEX_LO_ORI`, `INDEX_CELL_ORI`, `INDEX_AS_ORI`, `INDEX_LT_VAL31`, `INDEX_CC_ORI`, `INDEX_REPRO_ORI`, `INDEX_ME_ORI`, `INDEX_IS_ORI`, `INDEX_FERV_VAL31`, `INDEX_AA_ORI`, `INDEX_VNAI_VALSM`, `INDEX_MG_VAL3`, `INDEX_AH_ORI`, `INDEX_LO_VAL31`, `INDEX_EA_ORI`, `INDEX_LP_VAL31`, `INDEX_FNAI_CD`, `INDEX_MACL_VAL31`, `INDEX_IS_VAL31`, `INDEX_ISU_VAL3`, `INDEX_AA_VAL31`, `INDEX_IA_ORI`, `INDEX_STMA_VAL31`, `INDEX_MA_ORI`, `INDEX_TR_ORI`, `INDEX_PS_VAL31`, `INDEX_CELL_VAL31`, `INDEX_MO_ORI`, `INDEX_LAIT_REF`, `INDEX_FVEL_VALSM`, `INDEX_EC_VAL31`, `INDEX_VVEL_CD`, `INDEX_MR_VAL31`, `INDEX_EA_VAL31`, `INDEX_MR_ORI`, `INDEX_PC_ORI`, `INDEX_CELL_CD`, `INDEX_FERG_VAL31`, `INDEX_PJ_ORI`, `INDEX_ISU_ORI`, `INDEX_MP_VAL3`, `INDEX_PJ_VAL31`, `INDEX_MO_VAL31`, `INDEX_TP_VAL31`, `INDEX_IVIA1_ORI`, `INDEX_MG_ORI`, `INDEX_LGF_ORI`, `INDEX_TR_VAL31`, `INDEX_PI_VAL31`, `INDEX_VNAI_ORI`, `INDEX_IB_VAL31`, `INDEX_AJ_ORI`, `INDEX_MU_ORI`, `INDEX_FNAI_VALSM`, `ORIGINE`, `REF`, `EDITION`, `NUIETA`, `AANAIS`, `COPAMA`, `COPAFN`, `NBGECO`, `IdPAYS`, `COPAIP`, `NUNATI`, `POINTS_FORTS`, `POINTS_FAIBLES`) VALUES ('US000135746776', NULL, NULL, 'MAN-O-MAN', 'LONG-LANGS OMAN OMAN-ET', NULL, 'FRU130677626', 'FRUS14889171', 'FRUS13066986', 'FRUS13751565', 'FRUS14962609', 'FRUS02071864', 'FRUS01986164', 'FRUS01912270', 'FRUS02103297', 'FRUS02183007', 'FRUS15459080', 'FRUS15930909', 'FRUS02265005', 'AARON', 'FRU122358313', 'O-MAN JUST', NULL, NULL, NULL, 95, 1.2, 1.2, 95, 0.2, NULL, 95, NULL, -0.4, NULL, NULL, NULL, 42, 95, NULL, 0.1, 95, -0.1, NULL, 27, 175, NULL, 95, NULL, 0.5, -0.1, NULL, NULL, 93, 1.8, 95, NULL, 0.9, 95, NULL, 0.8, -0.8, NULL, 1482, NULL, NULL, NULL, 95, 1798, -0.2, NULL, NULL, -0.1, 0.9, NULL, NULL, NULL, 0.0, NULL, NULL, NULL, NULL, 1.1, NULL, 93, 21, NULL, -0.2, NULL, -0.1, 95, -0.8, -0.4, 121, -1.5, NULL, -0.8, NULL, NULL, 1.8, -0.7, NULL, NULL, 93, -0.5, 99, -0.8, 0.0, NULL, NULL, 95, 0.2, NULL, NULL, 21, 0.2, -0.1, 1.8, NULL, NULL, NULL, 0.4, -0.2, NULL, -0.9, NULL, NULL, 92, 'EF', NULL, 'a:31:{i:0;s:11:"FRA 08/2019";i:1;s:11:"FRA 04/2019";i:2;s:11:"FRA 12/2018";i:3;s:11:"FRA 12/2018";i:4;s:11:"FRA 08/2018";i:5;s:11:"FRA 04/2018";i:6;s:11:"FRA 12/2017";i:7;s:11:"FRA 08/2017";i:8;s:11:"FRA 04/2017";i:9;s:11:"FRA 12/2016";i:10;s:11:"FRA 08/2016";i:11;s:11:"FRA 04/2016";i:12;s:11:"FRA 12/2015";i:13;s:11:"FRA 08/2015";i:14;s:11:"FRA 04/2015";i:15;s:11:"FRA 12/2014";i:16;s:11:"FRA 
08/2014";i:17;s:11:"FRA 04/2014";i:18;s:11:"FRA 10/2013";i:19;s:11:"FRA 06/2013";i:20;s:11:"INT 04/2013";i:21;s:11:"INT 12/2012";i:22;s:11:"INT 08/2012";i:23;s:11:"INT 04/2012";i:24;s:11:"INT 12/2011";i:25;s:11:"INT 08/2011";i:26;s:11:"INT 04/2011";i:27;s:11:"INT 12/2010";i:28;s:11:"INT 08/2010";i:29;s:11:"INT 04/2010";i:30;s:11:"INT 01/2010";}', 75694, '2004-04-30', NULL, NULL, NULL, NULL, 'US', '000135746776', 'AJ,HS,TP', 'AA,IA,IS,LP,MO,MR,PI,STMA');

Thanks for your help! :)

Views, Materialized Views, Functions not exported from Postgres database to SQLite file

I am trying to export a few schemas from my postgres database to SQLite as below:

/home/admin/.local/bin/db-to-sqlite "postgresql://postgres:password@host:5432/configuration" cplabor.db --all --postgres-schema dbd_cp_labor

This particular schema has views, materialized views, functions and data types as well, apart from tables. However, when I run:

❯ sqlite3 cplabor.db
SQLite version 3.27.2 2019-02-25 16:06:06
Enter ".help" for usage hints.
sqlite> Select * from dbd_cp_labor.vw_revenue_by_year;
Error: no such table: dbd_cp_labor.vw_revenue_by_year
sqlite> Select * from vw_revenue_by_year;
Error: no such table: vw_revenue_by_year

I am able to see only the tables and not the views or functions. Could I be missing something here? Please guide.

Is it possible to export from a SQLite database?

I'm running a SQLite database in memory, with URL sqlite://.

While the SQLite database is still open, invoking the following code:

db-to-sqlite sqlite:// database_copy.db --all --no-index-fks

... produces an output file database_copy.db which is not recognized as a SQLite database.

I'm using db-to-sqlite version 1.1.5.

Constraint Issue

Hello,
when I run the following command:

db-to-sqlite "postgresql://username:password@localhost/databaseName" databaseName.db \
    --all

it dumps my database with all tables, but it does not copy all of the constraints. How can I copy all my constraints, like unique, check, foreign key and all the others?

I am running it on ubuntu 20.04

SQLite version supporting the RETURNING clause

Thank you for making such a handy tool. Recently, I ran into a problem: the SQLite version used by the tool (3.12.99) doesn't support the RETURNING clause. Is there a version that supports at least 3.35.0? Thank you for your time.
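(Note: the .db file this tool produces is an ordinary SQLite database; RETURNING support depends on the SQLite library linked into whatever client reads the file, not on the file itself. Under Python you can check that library version like this:)

python -c "import sqlite3; print(sqlite3.sqlite_version)"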
