
dbt-starrocks's Introduction

Download | Docs | Benchmarks | Demo


StarRocks, a Linux Foundation project, is a next-generation data platform designed to make data-intensive real-time analytics fast and easy. It delivers query speeds 5 to 10 times faster than other popular solutions. StarRocks performs real-time analytics well even while historical records are being updated, and it can easily enrich real-time analytics with historical data from data lakes. With StarRocks, you can do away with denormalized tables and get the best of both performance and flexibility.

Learn more 👉🏻 What Is StarRocks: Features and Use Cases



Features

  • 🚀 Native vectorized SQL engine: StarRocks adopts vectorization technology to make full use of the parallel computing power of the CPU, achieving sub-second query returns in multi-dimensional analyses, 5 to 10 times faster than previous systems.
  • 📊 Standard SQL: StarRocks supports ANSI SQL syntax (TPC-H and TPC-DS are fully supported). It is also compatible with the MySQL protocol, so a wide range of clients and BI tools can be used to access StarRocks.
  • 💡 Smart query optimization: StarRocks optimizes complex queries through a cost-based optimizer (CBO). With a better execution plan, data analysis efficiency improves significantly.
  • ⚡ Real-time update: The update model of StarRocks performs upsert/delete operations according to the primary key while keeping queries efficient under concurrent updates.
  • 🪟 Intelligent materialized view: Materialized views in StarRocks are updated automatically during data import and selected automatically when queries are executed.
  • ✨ Querying data in data lakes directly: StarRocks allows direct access to data from Apache Hive™, Apache Iceberg™, Delta Lake™ and Apache Hudi™ without importing.
  • 🎛️ Resource management: This feature allows StarRocks to limit resource consumption for queries and implement isolation and efficient use of resources among tenants in the same cluster.
  • 💠 Easy to maintain: A simple architecture makes StarRocks easy to deploy, maintain, and scale out. StarRocks tunes its query plans agilely, rebalances resources when the cluster is scaled in or out, and automatically recovers data replicas after node failures.

Architecture Overview

StarRocks’s streamlined architecture is mainly composed of two modules: Frontend (FE) and Backend (BE). The entire system eliminates single points of failure through seamless and horizontal scaling of FE and BE, as well as replication of metadata and data.

Starting from version 3.0, StarRocks supports a new shared-data architecture, which can provide better scalability and lower costs.


Resources

📚 Read the docs

Section       Description
Quick Starts  How-tos and tutorials.
Deploy        Learn how to run and configure StarRocks.
Docs          Full documentation.
Blogs         StarRocks deep dives and user stories.

❓ Get support


Contributing to StarRocks

We welcome all kinds of contributions from the community, individuals and partners. We owe our success to your active involvement.

  1. See Contributing.md to get started.
  2. Set up the StarRocks development environment.
  3. Understand our GitHub workflow for opening a pull request; use this PR Template when submitting a pull request.
  4. Pick a good first issue and start contributing.

📝 License: StarRocks is licensed under Apache License 2.0.

👥 Community Membership: Learn more about the different contributor roles in the StarRocks community.


Used By

This project is used by the following companies. Learn more about their use cases:

dbt-starrocks's People

Contributors

alberttwong, astralidea, dan-j-d, hatlassian, imay, long2ice, motto1314


dbt-starrocks's Issues

Database Error in snapshot ts_snapshot

Invoking dbt with ['snapshot', '--vars', 'seed_name: added']
21:41:14  Running with dbt=1.6.2
21:41:14  Registered adapter: starrocks=1.4.2
21:41:14  Unable to do partial parsing because config vars, config profile, or config target have changed
21:41:14  Found 1 snapshot, 3 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
21:41:14  
21:41:14  Concurrency: 1 threads (target='default')
21:41:14  
21:41:14  1 of 1 START snapshot test17102796730728107077_test_basic.ts_snapshot .......... [RUN]
21:41:15  1 of 1 ERROR snapshotting test17102796730728107077_test_basic.ts_snapshot ...... [ERROR in 0.31s]
21:41:15  
21:41:15  Finished running 1 snapshot in 0 hours 0 minutes and 0.37 seconds (0.37s).
21:41:15  
21:41:15  Completed with 1 error and 0 warnings:
21:41:15  
21:41:15    Database Error in snapshot ts_snapshot (snapshots/ts_snapshot.sql)
  1064 (HY000): Getting syntax error at line 3, column 68. Detail message: Unexpected input ',', the most similar input is {'SET'}.
  compiled Code at target/run/snapshot_strategy_timestamp/snapshots/ts_snapshot.sql
21:41:15  
21:41:15  Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
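For orientation, dbt's snapshot materialization tags each row version with a dbt_scd_id surrogate key, computed as md5(concat_ws('|', ...)) over the unique-key and updated-at columns; this hash appears in the compiled snapshot SQL in the test-suite issue below. A minimal Python sketch of that computation, with illustrative values rather than data from the failing run:

```python
import hashlib

def dbt_scd_id(*cols) -> str:
    """Mirror dbt's md5(concat_ws('|', coalesce(cast(col as char), ''), ...))."""
    # NULL columns become empty strings, per the coalesce(..., '') wrapper
    joined = "|".join("" if c is None else str(c) for c in cols)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

# e.g. unique key 1, updated at 2023-01-01 00:00:00 (hypothetical values)
scd_id = dbt_scd_id(1, "2023-01-01 00:00:00")
assert len(scd_id) == 32  # a 32-character hex digest
```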

upgrade to support dbt-core v1.3.0

Background

The latest release cut for 1.3.0, dbt-core==1.3.0rc2, was published on October 3, 2022 (PyPI | GitHub). We are targeting the official 1.3.0 release for the week of October 16 (in time for the Coalesce conference).

We're trying to establish the following precedent w.r.t. minor versions: partner adapter maintainers release their adapter's minor version within four weeks of the initial RC being released. Given the delay on our side in notifying you, we'd like to set a target date of November 7 (four weeks from today) for maintainers to release their minor version.

Timeframe    Date (intended)  Date (actual)  Event
D - 3 weeks  Sep 21           Oct 10         dbt Labs informs maintainers of upcoming minor release
D - 2 weeks  Sep 28           Sep 28         core 1.3 RC is released
Day D        Oct 12           Oct 12         core 1.3 official is published
D + 2 weeks  Oct 26           Nov 7          dbt-adapter 1.3 is published

How to upgrade

dbt-labs/dbt-core#6011 is an open discussion with more detailed information, and dbt-labs/dbt-core#6040 is for keeping track of the community's progress on releasing 1.3.0.

Below is a checklist of work that would enable a successful 1.3.0 release of your adapter.

  • Python Models (if applicable)
  • Incremental Materialization: cleanup and standardization
  • More functional adapter tests to inherit

Pass DBT basic unit test adapter suite

CLIENT: Server listening on port 65417...
Received JSON data in run script
Running pytest with args: ['-p', 'vscode_pytest', '--rootdir=/Users/atwong/sandbox/dbt-starrocks', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestSingularTestsMyAdapter::test_singular_tests', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestSingularTestsEphemeralMyAdapter::test_singular_tests_ephemeral', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestEmptyMyAdapter::test_empty', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestEphemeralMyAdapter::test_ephemeral', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestIncrementalMyAdapter::test_incremental', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestGenericTestsMyAdapter::test_generic_tests', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestSnapshotCheckColsMyAdapter::test_snapshot_check_cols', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestSnapshotTimestampMyAdapter::test_snapshot_timestamp', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestBaseAdapterMethod::test_adapter_methods']
============================= test session starts ==============================
platform darwin -- Python 3.9.16, pytest-8.1.1, pluggy-1.4.0
rootdir: /Users/atwong/sandbox/dbt-starrocks
configfile: pytest.ini
plugins: dotenv-0.5.2
collected 9 items

tests/functional/adapter/test_basic.py ......FF.                         [100%]

=================================== FAILURES ===================================
___________ TestSnapshotCheckColsMyAdapter.test_snapshot_check_cols ____________

self = <test_basic.TestSnapshotCheckColsMyAdapter object at 0x10ce45310>
project = <dbt.tests.fixtures.project.TestProjInfo object at 0x10d370c40>

    def test_snapshot_check_cols(self, project):
        # seed command
        results = run_dbt(["seed"])
        assert len(results) == 2
    
        # snapshot command
>       results = run_dbt(["snapshot"])

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/adapter/basic/test_snapshot_check_cols.py:44: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = ['snapshot', '--project-dir', '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-97/project6', '--profiles-dir', '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-97/profile6']
expect_pass = True

    def run_dbt(
        args: Optional[List[str]] = None,
        expect_pass: bool = True,
    ):
        # Ignore logbook warnings
        warnings.filterwarnings("ignore", category=DeprecationWarning, module="logbook")
    
        # reset global vars
        reset_metadata_vars()
    
        # The logger will complain about already being initialized if
        # we don't do this.
        log_manager.reset_handlers()
        if args is None:
            args = ["run"]
    
        print("\n\nInvoking dbt with {}".format(args))
        from dbt.flags import get_flags
    
        flags = get_flags()
        project_dir = getattr(flags, "PROJECT_DIR", None)
        profiles_dir = getattr(flags, "PROFILES_DIR", None)
        if project_dir and "--project-dir" not in args:
            args.extend(["--project-dir", project_dir])
        if profiles_dir and "--profiles-dir" not in args:
            args.extend(["--profiles-dir", profiles_dir])
    
        dbt = dbtRunner()
        res = dbt.invoke(args)
    
        # the exception is immediately raised to be caught in tests
        # using a pattern like `with pytest.raises(SomeException):`
        if res.exception is not None:
            raise res.exception
    
        if expect_pass is not None:
>           assert res.success == expect_pass, "dbt exit state did not match expected"
E           AssertionError: dbt exit state did not match expected

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/util.py:108: AssertionError
---------------------------- Captured stdout setup -----------------------------

=== Test project_root: /private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-97/project6
----------------------------- Captured stdout call -----------------------------


Invoking dbt with ['seed']
23:40:48  Running with dbt=1.6.2
23:40:48  Registered adapter: starrocks=1.4.2
23:40:48  Unable to do partial parsing because saved manifest not found. Starting full parse.
23:40:48  Found 3 snapshots, 2 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
23:40:48  
23:40:49  Concurrency: 1 threads (target='default')
23:40:49  
23:40:49  1 of 2 START seed file test17102868486143503769_test_basic.added ............... [RUN]
23:40:49  1 of 2 OK loaded seed file test17102868486143503769_test_basic.added ........... [INSERT 20 in 0.22s]
23:40:49  2 of 2 START seed file test17102868486143503769_test_basic.base ................ [RUN]
23:40:49  2 of 2 OK loaded seed file test17102868486143503769_test_basic.base ............ [INSERT 10 in 0.19s]
23:40:49  
23:40:49  Finished running 2 seeds in 0 hours 0 minutes and 0.47 seconds (0.47s).
23:40:49  
23:40:49  Completed successfully
23:40:49  
23:40:49  Done. PASS=2 WARN=0 ERROR=0 SKIP=0 TOTAL=2


Invoking dbt with ['snapshot']
23:40:49  Running with dbt=1.6.2
23:40:49  Registered adapter: starrocks=1.4.2
23:40:49  Found 3 snapshots, 2 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
23:40:49  
23:40:49  Concurrency: 1 threads (target='default')
23:40:49  
23:40:49  1 of 3 START snapshot test17102868486143503769_test_basic.cc_all_snapshot ...... [RUN]
23:40:49  1 of 3 ERROR snapshotting test17102868486143503769_test_basic.cc_all_snapshot .. [ERROR in 0.04s]
23:40:49  2 of 3 START snapshot test17102868486143503769_test_basic.cc_date_snapshot ..... [RUN]
23:40:49  2 of 3 ERROR snapshotting test17102868486143503769_test_basic.cc_date_snapshot . [ERROR in 0.01s]
23:40:49  3 of 3 START snapshot test17102868486143503769_test_basic.cc_name_snapshot ..... [RUN]
23:40:49  3 of 3 ERROR snapshotting test17102868486143503769_test_basic.cc_name_snapshot . [ERROR in 0.04s]
23:40:49  
23:40:49  Finished running 3 snapshots in 0 hours 0 minutes and 0.14 seconds (0.14s).
23:40:49  
23:40:49  Completed with 3 errors and 0 warnings:
23:40:49  
23:40:49    Compilation Error in snapshot cc_all_snapshot (snapshots/cc_all_snapshot.sql)
  macro 'dbt_macro__snapshot_check_all_get_existing_columns' takes not more than 2 argument(s)
  
  > in macro snapshot_check_strategy (macros/materializations/snapshots/strategies.sql)
  > called by macro materialization_snapshot_starrocks (macros/materializations/snapshot/snapshot.sql)
  > called by snapshot cc_all_snapshot (snapshots/cc_all_snapshot.sql)
23:40:49  
23:40:49    Compilation Error in snapshot cc_date_snapshot (snapshots/cc_date_snapshot.sql)
  macro 'dbt_macro__snapshot_check_all_get_existing_columns' takes not more than 2 argument(s)
  
  > in macro snapshot_check_strategy (macros/materializations/snapshots/strategies.sql)
  > called by macro materialization_snapshot_starrocks (macros/materializations/snapshot/snapshot.sql)
  > called by snapshot cc_date_snapshot (snapshots/cc_date_snapshot.sql)
23:40:49  
23:40:49    Compilation Error in snapshot cc_name_snapshot (snapshots/cc_name_snapshot.sql)
  macro 'dbt_macro__snapshot_check_all_get_existing_columns' takes not more than 2 argument(s)
  
  > in macro snapshot_check_strategy (macros/materializations/snapshots/strategies.sql)
  > called by macro materialization_snapshot_starrocks (macros/materializations/snapshot/snapshot.sql)
  > called by snapshot cc_name_snapshot (snapshots/cc_name_snapshot.sql)
23:40:49  
23:40:49  Done. PASS=0 WARN=0 ERROR=3 SKIP=0 TOTAL=3
____________ TestSnapshotTimestampMyAdapter.test_snapshot_timestamp ____________

self = <test_basic.TestSnapshotTimestampMyAdapter object at 0x10ce45bb0>
project = <dbt.tests.fixtures.project.TestProjInfo object at 0x10d3c5970>

    def test_snapshot_timestamp(self, project):
        # seed command
        results = run_dbt(["seed"])
        assert len(results) == 3
    
        # snapshot command
        results = run_dbt(["snapshot"])
        assert len(results) == 1
    
        # snapshot has 10 rows
        check_relation_rows(project, "ts_snapshot", 10)
    
        # point at the "added" seed so the snapshot sees 10 new rows
>       results = run_dbt(["-d", "snapshot", "--vars", "seed_name: added"])

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/adapter/basic/test_snapshot_timestamp.py:49: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = ['snapshot', '--vars', 'seed_name: added', '--project-dir', '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-97/project7', '--profiles-dir', ...]
expect_pass = True

    def run_dbt(
        args: Optional[List[str]] = None,
        expect_pass: bool = True,
    ):
        # Ignore logbook warnings
        warnings.filterwarnings("ignore", category=DeprecationWarning, module="logbook")
    
        # reset global vars
        reset_metadata_vars()
    
        # The logger will complain about already being initialized if
        # we don't do this.
        log_manager.reset_handlers()
        if args is None:
            args = ["run"]
    
        print("\n\nInvoking dbt with {}".format(args))
        from dbt.flags import get_flags
    
        flags = get_flags()
        project_dir = getattr(flags, "PROJECT_DIR", None)
        profiles_dir = getattr(flags, "PROFILES_DIR", None)
        if project_dir and "--project-dir" not in args:
            args.extend(["--project-dir", project_dir])
        if profiles_dir and "--profiles-dir" not in args:
            args.extend(["--profiles-dir", profiles_dir])
    
        dbt = dbtRunner()
        res = dbt.invoke(args)
    
        # the exception is immediately raised to be caught in tests
        # using a pattern like `with pytest.raises(SomeException):`
        if res.exception is not None:
            raise res.exception
    
        if expect_pass is not None:
>           assert res.success == expect_pass, "dbt exit state did not match expected"
E           AssertionError: dbt exit state did not match expected

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/util.py:108: AssertionError
---------------------------- Captured stdout setup -----------------------------

=== Test project_root: /private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-97/project7
----------------------------- Captured stdout call -----------------------------


Invoking dbt with ['seed']
23:40:49  Running with dbt=1.6.2
23:40:49  Registered adapter: starrocks=1.4.2
23:40:49  Unable to do partial parsing because saved manifest not found. Starting full parse.
23:40:50  Found 1 snapshot, 3 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
23:40:50  
23:40:50  Concurrency: 1 threads (target='default')
23:40:50  
23:40:50  1 of 3 START seed file test17102868504212522810_test_basic.added ............... [RUN]
23:40:50  1 of 3 OK loaded seed file test17102868504212522810_test_basic.added ........... [INSERT 20 in 0.24s]
23:40:50  2 of 3 START seed file test17102868504212522810_test_basic.base ................ [RUN]
23:40:50  2 of 3 OK loaded seed file test17102868504212522810_test_basic.base ............ [INSERT 10 in 0.19s]
23:40:50  3 of 3 START seed file test17102868504212522810_test_basic.newcolumns .......... [RUN]
23:40:51  3 of 3 OK loaded seed file test17102868504212522810_test_basic.newcolumns ...... [INSERT 10 in 0.21s]
23:40:51  
23:40:51  Finished running 3 seeds in 0 hours 0 minutes and 0.69 seconds (0.69s).
23:40:51  
23:40:51  Completed successfully
23:40:51  
23:40:51  Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3


Invoking dbt with ['snapshot']
23:40:51  Running with dbt=1.6.2
23:40:51  Registered adapter: starrocks=1.4.2
23:40:51  Found 1 snapshot, 3 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
23:40:51  
23:40:51  Concurrency: 1 threads (target='default')
23:40:51  
23:40:51  1 of 1 START snapshot test17102868504212522810_test_basic.ts_snapshot .......... [RUN]
23:40:51  1 of 1 OK snapshotted test17102868504212522810_test_basic.ts_snapshot .......... [SUCCESS 10 in 0.20s]
23:40:51  
23:40:51  Finished running 1 snapshot in 0 hours 0 minutes and 0.24 seconds (0.24s).
23:40:51  
23:40:51  Completed successfully
23:40:51  
23:40:51  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1


Invoking dbt with ['-d', 'snapshot', '--vars', 'seed_name: added']
23:40:51  Running with dbt=1.6.2
23:40:51  running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'log_cache_events': 'False', 'write_json': 'True', 'partial_parse': 'True', 'cache_selected_only': 'False', 'profiles_dir': '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-97/profile7', 'version_check': 'True', 'warn_error': 'None', 'log_path': '/Users/atwong/sandbox/dbt-starrocks/logs/test17102868504212522810', 'debug': 'True', 'fail_fast': 'False', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'log_format': 'default', 'invocation_command': 'dbt --rootdir=/Users/atwong/sandbox/dbt-starrocks', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'static_parser': 'True', 'target_path': 'None', 'introspect': 'True', 'send_anonymous_usage_stats': 'False'}
23:40:51  Connection '_test' was properly closed.
23:40:51  Registered adapter: starrocks=1.4.2
23:40:51  checksum: 28af41976244efb18c98fa3cfba69f85a75f3346c2f9b2bb706d0954f8ab0ad6, vars: {'seed_name': 'added'}, profile: , target: , version: 1.6.2
23:40:51  Unable to do partial parsing because config vars, config profile, or config target have changed
23:40:51  previous checksum: 28af41976244efb18c98fa3cfba69f85a75f3346c2f9b2bb706d0954f8ab0ad6, current checksum: 8d66c46a7c5c0e5717f369273dbde9db64740684d9ba42e6442c1808d698ae9f
23:40:51  Found 1 snapshot, 3 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
23:40:51  
23:40:51  Acquiring new starrocks connection 'master'
23:40:51  Acquiring new starrocks connection 'list_schemas'
23:40:51  Using starrocks connection "list_schemas"
23:40:51  On list_schemas: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "connection_name": "list_schemas"} */
select distinct schema_name from information_schema.schemata
23:40:51  Opening a new connection, currently in state init
23:40:51  SQL status: SUCCESS 8 in 0.0 seconds
23:40:51  On list_schemas: Close
23:40:51  Re-using an available connection from the pool (formerly list_schemas, now list_None_test17102868504212522810_test_basic)
23:40:51  Using starrocks connection "list_None_test17102868504212522810_test_basic"
23:40:51  On list_None_test17102868504212522810_test_basic: BEGIN
23:40:51  Opening a new connection, currently in state closed
23:40:51  SQL status: SUCCESS 0 in 0.0 seconds
23:40:51  Using starrocks connection "list_None_test17102868504212522810_test_basic"
23:40:51  On list_None_test17102868504212522810_test_basic: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "connection_name": "list_None_test17102868504212522810_test_basic"} */

    select
      null as "database",
      tbl.table_name as name,
      tbl.table_schema as "schema",
      case when tbl.table_type = 'BASE TABLE' then 'table'
           when tbl.table_type = 'VIEW' and mv.table_name is null then 'view'
           when tbl.table_type = 'VIEW' and mv.table_name is not null then 'materialized_view'
           when tbl.table_type = 'SYSTEM VIEW' then 'system_view'
           else 'unknown' end as table_type
    from information_schema.tables tbl
    left join information_schema.materialized_views mv
    on tbl.TABLE_SCHEMA = mv.TABLE_SCHEMA
    and tbl.TABLE_NAME = mv.TABLE_NAME
    where tbl.table_schema = 'test17102868504212522810_test_basic'
  
23:40:51  SQL status: SUCCESS 4 in 0.0 seconds
23:40:51  On list_None_test17102868504212522810_test_basic: ROLLBACK
23:40:51  On list_None_test17102868504212522810_test_basic: Close
23:40:51  Using starrocks connection "master"
23:40:51  On master: BEGIN
23:40:51  Opening a new connection, currently in state init
23:40:51  SQL status: SUCCESS 0 in 0.0 seconds
23:40:51  On master: COMMIT
23:40:51  Using starrocks connection "master"
23:40:51  On master: COMMIT
23:40:51  SQL status: SUCCESS 0 in 0.0 seconds
23:40:51  On master: Close
23:40:51  Concurrency: 1 threads (target='default')
23:40:51  
23:40:51  Began running node snapshot.snapshot_strategy_timestamp.ts_snapshot
23:40:51  1 of 1 START snapshot test17102868504212522810_test_basic.ts_snapshot .......... [RUN]
23:40:51  Re-using an available connection from the pool (formerly list_None_test17102868504212522810_test_basic, now snapshot.snapshot_strategy_timestamp.ts_snapshot)
23:40:51  Began compiling node snapshot.snapshot_strategy_timestamp.ts_snapshot
23:40:51  Timing info for snapshot.snapshot_strategy_timestamp.ts_snapshot (compile): 16:40:51.852241 => 16:40:51.854489
23:40:51  Began executing node snapshot.snapshot_strategy_timestamp.ts_snapshot
23:40:51  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:51  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */
select distinct schema_name from information_schema.schemata
23:40:51  Opening a new connection, currently in state closed
23:40:51  SQL status: SUCCESS 8 in 0.0 seconds
23:40:51  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:51  On snapshot.snapshot_strategy_timestamp.ts_snapshot: BEGIN
23:40:51  SQL status: SUCCESS 0 in 0.0 seconds
23:40:51  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:51  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale

    from INFORMATION_SCHEMA.columns
    where table_name = 'ts_snapshot'
      
      and table_schema = 'test17102868504212522810_test_basic'
      
    order by ordinal_position

  
23:40:51  SQL status: SUCCESS 7 in 0.0 seconds
23:40:51  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:51  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

        
  
    

  create table `test17102868504212522810_test_basic`.`ts_snapshot__dbt_tmp`
    PROPERTIES (
      "replication_num" = "1"
    )
  as with snapshot_query as (

        
    
    select * from `test17102868504212522810_test_basic`.`added`

    ),

    snapshotted_data as (

        select *,
            id as dbt_unique_key

        from `test17102868504212522810_test_basic`.`ts_snapshot`
        where dbt_valid_to is null

    ),

    insertions_source_data as (

        select
            *,
            id as dbt_unique_key,
            some_date as dbt_updated_at,
            some_date as dbt_valid_from,
            nullif(some_date, some_date) as dbt_valid_to,
            md5(concat_ws('|',coalesce(cast(id as char), '')
        , coalesce(cast(some_date as char), '')
        )) as dbt_scd_id

        from snapshot_query
    ),

    updates_source_data as (

        select
            *,
            id as dbt_unique_key,
            some_date as dbt_updated_at,
            some_date as dbt_valid_from,
            some_date as dbt_valid_to

        from snapshot_query
    ),

    insertions as (

        select
            'insert' as dbt_change_type,
            source_data.*

        from insertions_source_data as source_data
        left outer join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key
        where snapshotted_data.dbt_unique_key is null
           or (
                snapshotted_data.dbt_unique_key is not null
            and (
                (snapshotted_data.dbt_valid_from < source_data.some_date)
            )
        )

    ),

    updates as (

        select
            'update' as dbt_change_type,
            source_data.*,
            snapshotted_data.dbt_scd_id

        from updates_source_data as source_data
        join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key
        where (
            (snapshotted_data.dbt_valid_from < source_data.some_date)
        )
    )

    select * from insertions
    union all
    select * from updates

  
    
23:40:52  SQL status: SUCCESS 10 in 0.0 seconds
23:40:52  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale

    from INFORMATION_SCHEMA.columns
    where table_name = 'ts_snapshot__dbt_tmp'
      
      and table_schema = 'test17102868504212522810_test_basic'
      
    order by ordinal_position

  
23:40:52  SQL status: SUCCESS 9 in 0.0 seconds
23:40:52  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale

    from INFORMATION_SCHEMA.columns
    where table_name = 'ts_snapshot'
      
      and table_schema = 'test17102868504212522810_test_basic'
      
    order by ordinal_position

  
23:40:52  SQL status: SUCCESS 7 in 0.0 seconds
23:40:52  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale

    from INFORMATION_SCHEMA.columns
    where table_name = 'ts_snapshot__dbt_tmp'
      
      and table_schema = 'test17102868504212522810_test_basic'
      
    order by ordinal_position

  
23:40:52  SQL status: SUCCESS 9 in 0.0 seconds
23:40:52  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale

    from INFORMATION_SCHEMA.columns
    where table_name = 'ts_snapshot'
      
      and table_schema = 'test17102868504212522810_test_basic'
      
    order by ordinal_position

  
23:40:52  SQL status: SUCCESS 7 in 0.0 seconds
23:40:52  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale

    from INFORMATION_SCHEMA.columns
    where table_name = 'ts_snapshot__dbt_tmp'
      
      and table_schema = 'test17102868504212522810_test_basic'
      
    order by ordinal_position

  
23:40:52  SQL status: SUCCESS 9 in 0.0 seconds
23:40:52  Writing runtime sql for node "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:52  Using starrocks connection "snapshot.snapshot_strategy_timestamp.ts_snapshot"
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "node_id": "snapshot.snapshot_strategy_timestamp.ts_snapshot"} */

          update `test17102868504212522810_test_basic`.`ts_snapshot`, (select dbt_scd_id, dbt_change_type, dbt_valid_to from `test17102868504212522810_test_basic`.`ts_snapshot__dbt_tmp`) as DBT_INTERNAL_SOURCE
    set `test17102868504212522810_test_basic`.`ts_snapshot`.dbt_valid_to = DBT_INTERNAL_SOURCE.dbt_valid_to
    where DBT_INTERNAL_SOURCE.dbt_scd_id = `test17102868504212522810_test_basic`.`ts_snapshot`.dbt_scd_id
    and DBT_INTERNAL_SOURCE.dbt_change_type = 'update'
    and `test17102868504212522810_test_basic`.`ts_snapshot`.dbt_valid_to is null

      
23:40:52  starrocks adapter: StarRocks error: 1064 (HY000): Getting syntax error at line 3, column 68. Detail message: Unexpected input ',', the most similar input is {'SET'}.
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: ROLLBACK
23:40:52  Timing info for snapshot.snapshot_strategy_timestamp.ts_snapshot (execute): 16:40:51.855178 => 16:40:52.161686
23:40:52  On snapshot.snapshot_strategy_timestamp.ts_snapshot: Close
23:40:52  Database Error in snapshot ts_snapshot (snapshots/ts_snapshot.sql)
  1064 (HY000): Getting syntax error at line 3, column 68. Detail message: Unexpected input ',', the most similar input is {'SET'}.
  compiled Code at target/run/snapshot_strategy_timestamp/snapshots/ts_snapshot.sql
23:40:52  1 of 1 ERROR snapshotting test17102868504212522810_test_basic.ts_snapshot ...... [ERROR in 0.31s]
23:40:52  Finished running node snapshot.snapshot_strategy_timestamp.ts_snapshot
23:40:52  Using starrocks connection "master"
23:40:52  On master: BEGIN
23:40:52  Opening a new connection, currently in state closed
23:40:52  SQL status: SUCCESS 0 in 0.0 seconds
23:40:52  On master: COMMIT
23:40:52  Using starrocks connection "master"
23:40:52  On master: COMMIT
23:40:52  SQL status: SUCCESS 0 in 0.0 seconds
23:40:52  On master: Close
23:40:52  Connection 'master' was properly closed.
23:40:52  Connection 'snapshot.snapshot_strategy_timestamp.ts_snapshot' was properly closed.
23:40:52  
23:40:52  Finished running 1 snapshot in 0 hours 0 minutes and 0.38 seconds (0.38s).
23:40:52  Command end result
23:40:52  
23:40:52  Completed with 1 error and 0 warnings:
23:40:52  
23:40:52    Database Error in snapshot ts_snapshot (snapshots/ts_snapshot.sql)
  1064 (HY000): Getting syntax error at line 3, column 68. Detail message: Unexpected input ',', the most similar input is {'SET'}.
  compiled Code at target/run/snapshot_strategy_timestamp/snapshots/ts_snapshot.sql
23:40:52  
23:40:52  Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
23:40:52  Command `cli snapshot` failed at 16:40:52.181837 after 0.82 seconds
23:40:52  Flushing usage events
--------------------------- Captured stdout teardown ---------------------------
23:40:52  Acquiring new starrocks connection '_test'
23:40:52  Dropping schema "schema: "test17102868504212522810_test_basic"
".
23:40:52  Using starrocks connection "_test"
23:40:52  On _test: BEGIN
23:40:52  Opening a new connection, currently in state init
23:40:52  SQL status: SUCCESS 0 in 0.0 seconds
23:40:52  Using starrocks connection "_test"
23:40:52  On _test: /* {"app": "dbt", "dbt_version": "1.6.2", "profile_name": "test", "target_name": "default", "connection_name": "_test"} */

    drop schema if exists `test17102868504212522810_test_basic`
  
23:40:52  SQL status: SUCCESS 0 in 0.0 seconds
23:40:52  On _test: COMMIT
23:40:52  Using starrocks connection "_test"
23:40:52  On _test: COMMIT
23:40:52  SQL status: SUCCESS 0 in 0.0 seconds
23:40:52  On _test: Close
=========================== short test summary info ============================
FAILED tests/functional/adapter/test_basic.py::TestSnapshotCheckColsMyAdapter::test_snapshot_check_cols
FAILED tests/functional/adapter/test_basic.py::TestSnapshotTimestampMyAdapter::test_snapshot_timestamp
========================= 2 failed, 7 passed in 14.27s =========================
Finished running tests!

dbt --version says dbt-starrocks 1.4.2 is not compatible with dbt-core 1.6.2

(dbt-env) atwong@Albert-CelerData ~ % dbt  --version
Core:
  - installed: 1.6.2
  - latest:    1.7.7 - Update available!

  Your version of dbt-core is out of date!
  You can find instructions for upgrading here:
  https://docs.getdbt.com/docs/installation

Plugins:
  - starrocks: 1.4.2 - Not compatible!

  At least one plugin is out of date or incompatible with dbt-core.
  You can find instructions for upgrading here:
  https://docs.getdbt.com/docs/installation

StarRocksColumn(Column) does not support complex types

StarRocksColumn(Column) does not override from_description, so complex types such as `array<string>` are parsed incorrectly: the default implementation splits off the bracketed suffix and treats it as a numeric precision or string length, reducing the type to just "array". It could be useful to restrict that parsing to "(" when reading a type from a description.
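The parsing problem can be illustrated with a small sketch. These are hypothetical helpers, not the actual dbt or dbt-starrocks code: one parser treats any bracketed suffix as a precision/length and so collapses complex types, while one restricted to "(" leaves them intact.

```python
import re

# Illustration only: hypothetical helpers, not the actual dbt or
# dbt-starrocks implementation.

def naive_parse(raw_type: str):
    """Treat anything after the first '(' OR '<' as a precision/length
    suffix. This mirrors the failure mode described above: the base type
    of array<string> collapses to just 'array'."""
    match = re.match(r"([^(<]+)([(<].*)?$", raw_type)
    return match.group(1), match.group(2) or ""

def paren_only_parse(raw_type: str):
    """Only treat '(' as the start of a precision/length suffix, as the
    issue suggests, so complex types pass through intact."""
    match = re.match(r"([^(]+)(\(.*)?$", raw_type)
    return match.group(1), match.group(2) or ""

print(naive_parse("array<string>"))        # ('array', '<string>')
print(paren_only_parse("array<string>"))   # ('array<string>', '')
print(paren_only_parse("varchar(65535)"))  # ('varchar', '(65535)')
```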

Example:

  1. Create table
  2. Try to alter table
{{
    config(
        materialized='incremental',
        on_schema_change='append_new_columns'
    )
}}
select
    cast(NULL as int) as `atp_int`,
    cast(NULL as array<string>) as `atp_arr_string`
    where false

dbt-starrocks's optimization needs

dbt-starrocks has a preliminary implementation, but it still needs optimization.
This is the parent tracking issue for dbt-starrocks; we will add TODOs to this issue.
TODOs:

  • 1. Table Support Duplicate Key/Unique Key/Primary Key #4
  • 2. Table Support Aggregate Key #5
  • 3. Column Support HLL \ BITMAP \ JSON types #6
  • 4. Seeds use Stream Load #7
  • 5. Partition could full refresh #8

Support External Catalogs in DBT Adapter

Feature request

Is your feature request related to a problem? Please describe.

No, but a blocker to use the DBT adapter.

Describe the solution you'd like

The SR dbt connector should support reading from external tables/catalogs and writing materialized tables into SR internal storage.

We plan to use SR with disk caching for querying our external Hive tables instead of Presto/Athena. We have many processes built around using dbt to transform data in Athena. It would be great to be able to replace them with SR instead.

Describe alternatives you've considered

No alternatives at the moment, DBT runs happen in Athena and we read the resulting data in SR.

Production status?

Hello all,

As we are investigating StarRocks for production, integration with dbt is critical for us.

I have seen from the other issues and comments that this adapter is still in its early phases and will be added to dbt's official adapter documentation, but I wanted to ask whether there is any roadmap or rough timeline that can be shared?

Thank you!

add adapter to dbt docs's "Available Adapters" page

The Available Adapters page is one of the dbt community's most-visited docs pages. It would be of great benefit for first-time visitors to the dbt docs to see:

  1. that this adapter is a possible option for using dbt-core, and
  2. how large the dbt ecosystem of supported databases is.

dbt-labs/docs.getdbt.com#1489 exists to address this for all as-yet undocumented adapters.

We just released Documenting a new adapter, a new guide on how to add an adapter to the Available Adapters page. I'd love to see this adapter on that page, so feel free to reach out with any questions/blockers by either replying to this issue, or posting in the #adapter-ecosystem channel of the dbt Community Slack.

Looking forward to the contribution!

Table supports multiple data models

  1. Table supports selecting one of Duplicate Key/Unique Key/Primary Key
  2. Table supports setting ENGINE
  3. Table supports setting keys
  4. Table supports setting DISTRIBUTED BY
  5. Table supports setting PROPERTIES
  6. Table supports setting PARTITION BY

Field "type" of type Optional[StarRocksRelationType] in StarRocksRelation has invalid value 'cte'

CLIENT: Server listening on port 63061...
Received JSON data in run script
Running pytest with args: ['-p', 'vscode_pytest', '--rootdir=/Users/atwong/sandbox/dbt-starrocks', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestEphemeralMyAdapter::test_ephemeral']
============================= test session starts ==============================
platform darwin -- Python 3.9.16, pytest-8.1.1, pluggy-1.4.0
rootdir: /Users/atwong/sandbox/dbt-starrocks
configfile: pytest.ini
plugins: dotenv-0.5.2
collected 1 item

tests/functional/adapter/test_basic.py F                                 [100%]

=================================== FAILURES ===================================
____________________ TestEphemeralMyAdapter.test_ephemeral _____________________

self = <test_basic.TestEphemeralMyAdapter object at 0x109815250>
project = <dbt.tests.fixtures.project.TestProjInfo object at 0x107810c40>

    def test_ephemeral(self, project):
        # seed command
        results = run_dbt(["seed"])
        assert len(results) == 1
        check_result_nodes_by_name(results, ["base"])
    
        # run command
>       results = run_dbt(["run"])

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/adapter/basic/test_ephemeral.py:44: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = ['run', '--project-dir', '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-91/project0', '--profiles-dir', '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-91/profile0']
expect_pass = True

    def run_dbt(
        args: Optional[List[str]] = None,
        expect_pass: bool = True,
    ):
        # Ignore logbook warnings
        warnings.filterwarnings("ignore", category=DeprecationWarning, module="logbook")
    
        # reset global vars
        reset_metadata_vars()
    
        # The logger will complain about already being initialized if
        # we don't do this.
        log_manager.reset_handlers()
        if args is None:
            args = ["run"]
    
        print("\n\nInvoking dbt with {}".format(args))
        from dbt.flags import get_flags
    
        flags = get_flags()
        project_dir = getattr(flags, "PROJECT_DIR", None)
        profiles_dir = getattr(flags, "PROFILES_DIR", None)
        if project_dir and "--project-dir" not in args:
            args.extend(["--project-dir", project_dir])
        if profiles_dir and "--profiles-dir" not in args:
            args.extend(["--profiles-dir", profiles_dir])
    
        dbt = dbtRunner()
        res = dbt.invoke(args)
    
        # the exception is immediately raised to be caught in tests
        # using a pattern like `with pytest.raises(SomeException):`
        if res.exception is not None:
            raise res.exception
    
        if expect_pass is not None:
>           assert res.success == expect_pass, "dbt exit state did not match expected"
E           AssertionError: dbt exit state did not match expected

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/util.py:108: AssertionError
---------------------------- Captured stdout setup -----------------------------

=== Test project_root: /private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-91/project0
----------------------------- Captured stdout call -----------------------------


Invoking dbt with ['seed']
22:50:31  Running with dbt=1.6.2
22:50:31  Registered adapter: starrocks=1.4.2
22:50:31  Unable to do partial parsing because saved manifest not found. Starting full parse.
22:50:32  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
22:50:32  
22:50:32  Concurrency: 1 threads (target='default')
22:50:32  
22:50:32  1 of 1 START seed file test17102838317511145591_test_basic.base ................ [RUN]
22:50:32  1 of 1 OK loaded seed file test17102838317511145591_test_basic.base ............ [INSERT 10 in 0.27s]
22:50:32  
22:50:32  Finished running 1 seed in 0 hours 0 minutes and 0.34 seconds (0.34s).
22:50:32  
22:50:32  Completed successfully
22:50:32  
22:50:32  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1


Invoking dbt with ['run']
22:50:32  Running with dbt=1.6.2
22:50:32  Registered adapter: starrocks=1.4.2
22:50:32  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
22:50:32  
22:50:32  Concurrency: 1 threads (target='default')
22:50:32  
22:50:32  1 of 2 START sql table model test17102838317511145591_test_basic.table_model ... [RUN]
22:50:32  Unhandled error while executing 
Field "type" of type Optional[StarRocksRelationType] in StarRocksRelation has invalid value 'cte'
22:50:32  1 of 2 ERROR creating sql table model test17102838317511145591_test_basic.table_model  [ERROR in 0.01s]
22:50:32  2 of 2 START sql view model test17102838317511145591_test_basic.view_model ..... [RUN]
22:50:32  Unhandled error while executing 
Field "type" of type Optional[StarRocksRelationType] in StarRocksRelation has invalid value 'cte'
22:50:32  2 of 2 ERROR creating sql view model test17102838317511145591_test_basic.view_model  [ERROR in 0.00s]
22:50:32  
22:50:32  Finished running 1 table model, 1 view model in 0 hours 0 minutes and 0.06 seconds (0.06s).
22:50:32  
22:50:32  Completed with 2 errors and 0 warnings:
22:50:32  
22:50:32    Field "type" of type Optional[StarRocksRelationType] in StarRocksRelation has invalid value 'cte'
22:50:32  
22:50:32    Field "type" of type Optional[StarRocksRelationType] in StarRocksRelation has invalid value 'cte'
22:50:32  
22:50:32  Done. PASS=0 WARN=0 ERROR=2 SKIP=0 TOTAL=2
=========================== short test summary info ============================
FAILED tests/functional/adapter/test_basic.py::TestEphemeralMyAdapter::test_ephemeral
============================== 1 failed in 1.33s ===============================
Finished running tests!

dbt-starrocks 1.4.2 does not work with the latest dbt-core release, 1.7.*

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
dbt-starrocks 1.4.2 requires dbt-core==1.6.2, but you have dbt-core 1.7.7 which is incompatible.
Successfully installed dbt-core-1.7.7 dbt-extractor-0.5.1 dbt-semantic-interfaces-0.4.3 mashumaro-3.12
(dbt-env) atwong@Albert-CelerData ~ % dbt  --version
Core:
  - installed: 1.7.7
  - latest:    1.7.7 - Up to date!

Plugins:
  - starrocks: 1.4.2 - Not compatible!

  At least one plugin is out of date or incompatible with dbt-core.
  You can find instructions for upgrading here:
  https://docs.getdbt.com/docs/installation

upgrade to support dbt-core v1.4.0

Background

The latest version of dbt Core, dbt-core==1.4.0, was published on January 25, 2023 (PyPI | Github). In fact, a patch, dbt-core==1.4.1 (PyPI | Github), was also released on the same day.

How to upgrade

dbt-labs/dbt-core#6624 is an open discussion with more detailed information. If you have questions, please put them there! dbt-labs/dbt-core#6849 is for keeping track of the community's progress on releasing 1.4.0

The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.4.0 release of your adapter.

  • support Python 3.11 (only if your adapter's dependencies allow)
  • Consolidate timestamp functions & macros
  • Replace deprecated exception functions
  • Add support for more tests

the next minor release: 1.5.0

FYI, dbt-core==1.5.0 is expected to be released at the end of April. Please plan on allocating more effort to upgrading support than for previous minor versions. Expect to hear more in the middle of April.

At a high level, expect much greater adapter test coverage (a very good thing!), and likely some heavy renaming and restructuring as the API-ification of dbt-core is now well underway. See https://github.com/dbt-labs/dbt-core/milestone/82 for more information.

StarRocksColumn(Column) has a bug in treating String fields

The SR version of Column does not override the parent's string_type, which means "character varying" is used as the type name for String fields. This type does not exist in SR.

Steps to reproduce

  1. Create any table
  2. Try to add a column to a table via:
{{
    config(
        materialized='incremental',
        on_schema_change='append_new_columns'
    )
}}
select
    cast(NULL as varchar) as `atp_new`
    where false
  3. Observe the logs:
    add column atp_new character varying(1048576)

  4. The dbt run finishes with an error

Expected behavior

Steps 1 and 2 as above.
Logs should show something like:
add column atp_new varchar(65535)

And run should finish with success.

Versions

Core:
  - installed: 1.6.11
  - latest:    1.7.11 - Update available!

  Your version of dbt-core is out of date!
  You can find instructions for upgrading here:
  https://docs.getdbt.com/docs/installation

Plugins:
  - starrocks: 1.6.1 - Up to date!

Suggested fix

@classmethod
def string_type(cls, size: int) -> str:
    return "varchar({})".format(size)
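As a self-contained illustration of the fix: the Column stub below only mimics the relevant default behavior (which the log above shows emitting `character varying(1048576)`); it is not the real dbt-core class.

```python
class Column:
    # Stand-in for dbt-core's base Column: its default string_type emits
    # a type name StarRocks does not understand (see the
    # 'character varying(1048576)' log line above).
    @classmethod
    def string_type(cls, size: int) -> str:
        return "character varying({})".format(size)


class StarRocksColumn(Column):
    # With the suggested override, added columns get a type StarRocks
    # actually supports.
    @classmethod
    def string_type(cls, size: int) -> str:
        return "varchar({})".format(size)


print(Column.string_type(1048576))         # character varying(1048576)
print(StarRocksColumn.string_type(65535))  # varchar(65535)
```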

unsupported rename from table to view

CLIENT: Server listening on port 49466...
Received JSON data in run script
Running pytest with args: ['-p', 'vscode_pytest', '--rootdir=/Users/atwong/sandbox/dbt-starrocks', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestSimpleMaterializationsMyAdapter::test_base']
============================= test session starts ==============================
platform darwin -- Python 3.9.16, pytest-8.1.1, pluggy-1.4.0
rootdir: /Users/atwong/sandbox/dbt-starrocks
configfile: pytest.ini
plugins: dotenv-0.5.2
collected 1 item

tests/functional/adapter/test_basic.py F                                 [100%]

=================================== FAILURES ===================================
________________ TestSimpleMaterializationsMyAdapter.test_base _________________

self = <test_basic.TestSimpleMaterializationsMyAdapter object at 0x1190cbfa0>
project = <dbt.tests.fixtures.project.TestProjInfo object at 0x1070c85b0>

    def test_base(self, project):
    
        # seed command
        results = run_dbt(["seed"])
        # seed result length
        assert len(results) == 1
    
        # run command
        results = run_dbt()
        # run result length
        assert len(results) == 3
    
        # names exist in result nodes
        check_result_nodes_by_name(results, ["view_model", "table_model", "swappable"])
    
        # check relation types
        expected = {
            "base": "table",
            "view_model": "view",
            "table_model": "table",
            "swappable": "table",
        }
        check_relation_types(project.adapter, expected)
    
        # base table rowcount
        relation = relation_from_name(project.adapter, "base")
        result = project.run_sql(f"select count(*) as num_rows from {relation}", fetch="one")
        assert result[0] == 10
    
        # relations_equal
        check_relations_equal(project.adapter, ["base", "view_model", "table_model", "swappable"])
    
        # check relations in catalog
        catalog = run_dbt(["docs", "generate"])
        assert len(catalog.nodes) == 4
        assert len(catalog.sources) == 1
    
        # run_dbt changing materialized_var to view
        if project.test_config.get("require_full_refresh", False):  # required for BigQuery
            results = run_dbt(
                ["run", "--full-refresh", "-m", "swappable", "--vars", "materialized_var: view"]
            )
        else:
            results = run_dbt(["run", "-m", "swappable", "--vars", "materialized_var: view"])
        assert len(results) == 1
    
        # check relation types, swappable is view
        expected = {
            "base": "table",
            "view_model": "view",
            "table_model": "table",
            "swappable": "view",
        }
        check_relation_types(project.adapter, expected)
    
        # run_dbt changing materialized_var to incremental
>       results = run_dbt(["run", "-m", "swappable", "--vars", "materialized_var: incremental"])

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/adapter/basic/test_base.py:96: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = ['run', '-m', 'swappable', '--vars', 'materialized_var: incremental', '--project-dir', ...]
expect_pass = True

    def run_dbt(
        args: Optional[List[str]] = None,
        expect_pass: bool = True,
    ):
        # Ignore logbook warnings
        warnings.filterwarnings("ignore", category=DeprecationWarning, module="logbook")
    
        # reset global vars
        reset_metadata_vars()
    
        # The logger will complain about already being initialized if
        # we don't do this.
        log_manager.reset_handlers()
        if args is None:
            args = ["run"]
    
        print("\n\nInvoking dbt with {}".format(args))
        from dbt.flags import get_flags
    
        flags = get_flags()
        project_dir = getattr(flags, "PROJECT_DIR", None)
        profiles_dir = getattr(flags, "PROFILES_DIR", None)
        if project_dir and "--project-dir" not in args:
            args.extend(["--project-dir", project_dir])
        if profiles_dir and "--profiles-dir" not in args:
            args.extend(["--profiles-dir", profiles_dir])
    
        dbt = dbtRunner()
        res = dbt.invoke(args)
    
        # the exception is immediately raised to be caught in tests
        # using a pattern like `with pytest.raises(SomeException):`
        if res.exception is not None:
            raise res.exception
    
        if expect_pass is not None:
>           assert res.success == expect_pass, "dbt exit state did not match expected"
E           AssertionError: dbt exit state did not match expected

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/util.py:108: AssertionError
---------------------------- Captured stdout setup -----------------------------

=== Test project_root: /private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-79/project0
----------------------------- Captured stdout call -----------------------------


Invoking dbt with ['seed']
15:29:36  Running with dbt=1.6.2
15:29:36  Registered adapter: starrocks=1.4.2
15:29:36  Unable to do partial parsing because saved manifest not found. Starting full parse.
15:29:37  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
15:29:37  
15:29:37  Concurrency: 1 threads (target='default')
15:29:37  
15:29:37  1 of 1 START seed file test17102573770929885194_test_basic.base ................ [RUN]
15:29:37  1 of 1 OK loaded seed file test17102573770929885194_test_basic.base ............ [INSERT 10 in 0.25s]
15:29:37  
15:29:37  Finished running 1 seed in 0 hours 0 minutes and 0.34 seconds (0.34s).
15:29:37  
15:29:37  Completed successfully
15:29:37  
15:29:37  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1


Invoking dbt with ['run']
15:29:37  Running with dbt=1.6.2
15:29:37  Registered adapter: starrocks=1.4.2
15:29:37  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
15:29:37  
15:29:37  Concurrency: 1 threads (target='default')
15:29:37  
15:29:37  1 of 3 START sql table model test17102573770929885194_test_basic.swappable ..... [RUN]
15:29:37  1 of 3 OK created sql table model test17102573770929885194_test_basic.swappable  [SUCCESS 10 in 0.22s]
15:29:37  2 of 3 START sql table model test17102573770929885194_test_basic.table_model ... [RUN]
15:29:38  2 of 3 OK created sql table model test17102573770929885194_test_basic.table_model  [SUCCESS 10 in 0.20s]
15:29:38  3 of 3 START sql view model test17102573770929885194_test_basic.view_model ..... [RUN]
15:29:38  3 of 3 OK created sql view model test17102573770929885194_test_basic.view_model  [SUCCESS 0 in 0.04s]
15:29:38  
15:29:38  Finished running 2 table models, 1 view model in 0 hours 0 minutes and 0.50 seconds (0.50s).
15:29:38  
15:29:38  Completed successfully
15:29:38  
15:29:38  Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3


Invoking dbt with ['docs', 'generate']
15:29:38  Running with dbt=1.6.2
15:29:38  Registered adapter: starrocks=1.4.2
15:29:38  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
15:29:38  
15:29:38  Concurrency: 1 threads (target='default')
15:29:38  
15:29:38  Building catalog
15:29:38  Catalog written to /private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-79/project0/target/catalog.json


Invoking dbt with ['run', '-m', 'swappable', '--vars', 'materialized_var: view']
15:29:38  Running with dbt=1.6.2
15:29:38  Registered adapter: starrocks=1.4.2
15:29:38  Unable to do partial parsing because config vars, config profile, or config target have changed
15:29:38  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
15:29:38  
15:29:38  Concurrency: 1 threads (target='default')
15:29:38  
15:29:38  1 of 1 START sql view model test17102573770929885194_test_basic.swappable ...... [RUN]
15:29:39  1 of 1 OK created sql view model test17102573770929885194_test_basic.swappable . [SUCCESS 0 in 0.05s]
15:29:39  
15:29:39  Finished running 1 view model in 0 hours 0 minutes and 0.11 seconds (0.11s).
15:29:39  
15:29:39  Completed successfully
15:29:39  
15:29:39  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1


Invoking dbt with ['run', '-m', 'swappable', '--vars', 'materialized_var: incremental']
15:29:39  Running with dbt=1.6.2
15:29:39  Registered adapter: starrocks=1.4.2
15:29:39  Unable to do partial parsing because config vars, config profile, or config target have changed
15:29:39  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
15:29:39  
15:29:39  Concurrency: 1 threads (target='default')
15:29:39  
15:29:39  1 of 1 START sql incremental model test17102573770929885194_test_basic.swappable  [RUN]
15:29:39  1 of 1 ERROR creating sql incremental model test17102573770929885194_test_basic.swappable  [ERROR in 0.22s]
15:29:39  
15:29:39  Finished running 1 incremental model in 0 hours 0 minutes and 0.28 seconds (0.28s).
15:29:39  
15:29:39  Completed with 1 error and 0 warnings:
15:29:39  
15:29:39    Compilation Error in macro rename_relation (macros/adapters/relation.sql)
  unsupported rename from table to view
  
  > in macro statement (macros/etc/statement.sql)
  > called by macro starrocks__rename_relation (macros/adapters/relation.sql)
  > called by macro rename_relation (macros/adapters/relation.sql)
  > called by macro materialization_incremental_default (macros/materializations/models/incremental/incremental.sql)
  > called by macro rename_relation (macros/adapters/relation.sql)
15:29:39  
15:29:39  Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
=========================== short test summary info ============================
FAILED tests/functional/adapter/test_basic.py::TestSimpleMaterializationsMyAdapter::test_base
============================== 1 failed in 3.34s ===============================
Finished running tests!

Exception has occurred: CompilationError       (note: full exception trace is shown but execution is paused at: exception_handler)
Compilation Error in macro rename_relation (macros/adapters/relation.sql)
  unsupported rename from table to view
  
  > in macro statement (macros/etc/statement.sql)
  > called by macro starrocks__rename_relation (macros/adapters/relation.sql)
  > called by macro rename_relation (macros/adapters/relation.sql)
  > called by macro rename_relation (macros/adapters/relation.sql)
  File "/Users/atwong/sandbox/dbt-starrocks/dbt/adapters/starrocks/connections.py", line 176, in exception_handler (Current frame)
    yield
dbt.exceptions.CompilationError: Compilation Error in macro rename_relation (macros/adapters/relation.sql)
  unsupported rename from table to view

upgrade to support dbt-core v1.3.0

Background

The latest release cut for 1.3.0, dbt-core==1.3.0rc2, was published on October 3, 2022 (PyPI | Github). We are targeting releasing the official cut of 1.3.0 in time for the week of October 16 (in time for Coalesce conference).

We're trying to establish the following precedent w.r.t. minor versions:
Partner adapter maintainers release their adapter's minor version within four weeks of the initial RC being released. Given the delay on our side in notifying you, we'd like to set a target date of November 7 (four weeks from today) for maintainers to release their minor version.

Timeframe     Date (intended)   Date (actual)   Event
D - 3 weeks   Sep 21            Oct 10          dbt Labs informs maintainers of upcoming minor release
D - 2 weeks   Sep 28            Sep 28          core 1.3 RC is released
Day D         October 12        Oct 12          core 1.3 official is published
D + 2 weeks   October 26        Nov 7           dbt-adapter 1.3 is published

How to upgrade

dbt-labs/dbt-core#6011 is an open discussion with more detailed information, and dbt-labs/dbt-core#6040 is for keeping track of the community's progress on releasing 1.3.0

Below is a checklist of work that would enable a successful 1.3.0 release of your adapter.

  • Python Models (if applicable)
  • Incremental Materialization: cleanup and standardization
  • More functional adapter tests to inherit

'CMySQLConnection' object has no attribute 'server_version'

python3 -m pytest tests/functional
23:07:02  Finished running 1 seed in 0 hours 0 minutes and 0.10 seconds (0.10s).
23:07:02
23:07:02  Completed with 1 error and 0 warnings:
23:07:02
23:07:02    'CMySQLConnection' object has no attribute 'server_version'
23:07:02
23:07:02  Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1

Happens when I only provide an x.y release, not x.y.z

(dbt-env) atwong@Albert-CelerData dbt-starrocks % cat tests/conftest.py
import pytest
import os

# Import the standard functional fixtures as a plugin
# Note: fixtures with session scope need to be local
pytest_plugins = ["dbt.tests.fixtures.project"]

# The profile dictionary, used to write out profiles.yml
# dbt will supply a unique schema per test, so we do not specify 'schema' here
@pytest.fixture(scope="class")
def dbt_profile_target():
    return {
        'type': 'starrocks',
        'threads': 1,
        'server': 'localhost',
        'username': 'root',
        'password': '',
        'port': 9030,
        'ssl_disabled': True,
        'version': '3.1'
    }

[materialized_view] why not use swap instead of drop then create

After a materialized view (MV in the rest of this issue) is changed, we drop the existing MV and then recreate it, as in the following code:

{% macro starrocks__get_replace_materialized_view_as_sql(relation, sql, existing_relation, backup_relation, intermediate_relation) %}
    {{ starrocks__get_drop_relation_sql(existing_relation) }}
    {{ get_create_materialized_view_as_sql(relation, sql) }}
{% endmacro %}

I think there is a better way to alter an asynchronous MV, because StarRocks supports atomic swap, as described in this doc.
I will fork this as an internal project to use swap, but I also want to know what ideas the community has.

'int' object has no attribute 'isdigit' when version is set to 3.1.9

python3 -m pytest tests/functional
23:09:25  Completed with 3 errors and 0 warnings:
23:09:25
23:09:25    'int' object has no attribute 'isdigit'
23:09:25
23:09:25    'int' object has no attribute 'isdigit'
23:09:25
23:09:25    'int' object has no attribute 'isdigit'
23:09:25
23:09:25  Done. PASS=0 WARN=0 ERROR=3 SKIP=0 TOTAL=3
========================================================================================= short test summary info ==========================================================================================
FAILED tests/functional/adapter/test_basic.py::TestSimpleMaterializationsMyAdapter::test_base - AssertionError: dbt exit state did not match expected
FAILED tests/functional/adapter/test_basic.py::TestSingularTestsEphemeralMyAdapter::test_singular_tests_ephemeral - AssertionError: dbt exit state did not match expected
FAILED tests/functional/adapter/test_basic.py::TestEphemeralMyAdapter::test_ephemeral - AssertionError: dbt exit state did not match expected
FAILED tests/functional/adapter/test_basic.py::TestIncrementalMyAdapter::test_incremental - AssertionError: dbt exit state did not match expected
FAILED tests/functional/adapter/test_basic.py::TestGenericTestsMyAdapter::test_generic_tests - AssertionError: dbt exit state did not match expected
FAILED tests/functional/adapter/test_basic.py::TestSnapshotCheckColsMyAdapter::test_snapshot_check_cols - AssertionError: dbt exit state did not match expected
FAILED tests/functional/adapter/test_basic.py::TestSnapshotTimestampMyAdapter::test_snapshot_timestamp - AssertionError: dbt exit state did not match expected
======================================================================================= 7 failed, 3 passed in 7.70s ========================================================================================
(dbt-env) atwong@Albert-CelerData dbt-starrocks % cat tests/conftest.py
import pytest
import os

# Import the standard functional fixtures as a plugin
# Note: fixtures with session scope need to be local
pytest_plugins = ["dbt.tests.fixtures.project"]

# The profile dictionary, used to write out profiles.yml
# dbt will supply a unique schema per test, so we do not specify 'schema' here
@pytest.fixture(scope="class")
def dbt_profile_target():
    return {
        'type': 'starrocks',
        'threads': 1,
        'server': 'localhost',
        'username': 'root',
        'password': '',
        'port': 9030,
        'ssl_disabled': True,
        'version': '3.1.9'
    }
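One way `'int' object has no attribute 'isdigit'` can arise is version handling that mixes int and str components before calling str methods on them. A defensive normalizer (a sketch; `normalize_version` is not an actual adapter function) would coerce any profile `version` value to a tuple of ints first:

```python
import re

def normalize_version(raw):
    """Coerce a profile `version` value (a str like '3.1.9', or an int/float
    that YAML may hand back) into a tuple of ints, so later code never calls
    str methods such as `.isdigit()` on an int component."""
    nums = []
    for part in str(raw).split("."):
        m = re.match(r"\d+", part)  # leading digits only; drops build suffixes
        nums.append(int(m.group()) if m else 0)
    return tuple(nums)

print(normalize_version("3.1.9"))          # (3, 1, 9)
print(normalize_version(3))                # (3,)
print(normalize_version("3.2.2-269e832"))  # (3, 2, 2)
```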

StarRocks error: 1064 (HY000): Insert has filtered data in strict mode, txn_id = 4 tracking sql = select tracking_log from information_schema.load_tracking_logs where job_id=11119

Stuck at step 5. https://docs.getdbt.com/guides/manual-install?step=5

(dbt-env) atwong@Albert-CelerData jaffle_shop % dbt run -d
17:10:53  Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x103a53760>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1087a1490>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1087a13a0>]}
17:10:53  Running with dbt=1.6.10
17:10:53  running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'write_json': 'True', 'log_cache_events': 'False', 'partial_parse': 'True', 'cache_selected_only': 'False', 'warn_error': 'None', 'fail_fast': 'False', 'debug': 'True', 'log_path': '/Users/atwong/sandbox/dbt-tutorial/jaffle_shop/logs', 'version_check': 'True', 'profiles_dir': '/Users/atwong/.dbt', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'static_parser': 'True', 'log_format': 'default', 'introspect': 'True', 'target_path': 'None', 'invocation_command': 'dbt run -d', 'send_anonymous_usage_stats': 'True'}
17:10:53  Sending event: {'category': 'dbt', 'action': 'project_id', 'label': 'c587bbfa-b9bc-4b0c-9ce7-2451bb59989d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1086b1f70>]}
17:10:53  Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': 'c587bbfa-b9bc-4b0c-9ce7-2451bb59989d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1092b8f40>]}
17:10:53  Registered adapter: starrocks=1.6.1
17:10:53  checksum: 28908e88b83a05550f08d8d5005b031bfd1e6cf5c944d3c74469cec401f04961, vars: {}, profile: , target: , version: 1.6.10
17:10:53  Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
17:10:53  Partial parsing enabled, no changes found, skipping parsing
17:10:53  Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'c587bbfa-b9bc-4b0c-9ce7-2451bb59989d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1094350d0>]}
17:10:53  Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'c587bbfa-b9bc-4b0c-9ce7-2451bb59989d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1093fbaf0>]}
17:10:53  Found 2 models, 4 tests, 0 sources, 0 exposures, 0 metrics, 338 macros, 0 groups, 0 semantic models
17:10:53  Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'c587bbfa-b9bc-4b0c-9ce7-2451bb59989d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10936e100>]}
17:10:53
17:10:53  Acquiring new starrocks connection 'master'
17:10:53  Acquiring new starrocks connection 'list_schemas'
17:10:53  Using starrocks connection "list_schemas"
17:10:53  On list_schemas: /* {"app": "dbt", "dbt_version": "1.6.10", "profile_name": "jaffle_shop", "target_name": "dev", "connection_name": "list_schemas"} */
select distinct schema_name from information_schema.schemata
17:10:53  Opening a new connection, currently in state init
17:10:53  SQL status: SUCCESS 5 in 0.0 seconds
17:10:53  On list_schemas: Close
17:10:53  Re-using an available connection from the pool (formerly list_schemas, now list_None_testing)
17:10:53  Using starrocks connection "list_None_testing"
17:10:53  On list_None_testing: BEGIN
17:10:53  Opening a new connection, currently in state closed
17:10:53  SQL status: SUCCESS 0 in 0.0 seconds
17:10:53  Using starrocks connection "list_None_testing"
17:10:53  On list_None_testing: /* {"app": "dbt", "dbt_version": "1.6.10", "profile_name": "jaffle_shop", "target_name": "dev", "connection_name": "list_None_testing"} */

    select
      null as "database",
      tbl.table_name as name,
      tbl.table_schema as "schema",
      case when tbl.table_type = 'BASE TABLE' then 'table'
           when tbl.table_type = 'VIEW' and mv.table_name is null then 'view'
           when tbl.table_type = 'VIEW' and mv.table_name is not null then 'materialized_view'
           when tbl.table_type = 'SYSTEM VIEW' then 'system_view'
           else 'unknown' end as table_type
    from information_schema.tables tbl
    left join information_schema.materialized_views mv
    on tbl.TABLE_SCHEMA = mv.TABLE_SCHEMA
    and tbl.TABLE_NAME = mv.TABLE_NAME
    where tbl.table_schema = 'testing'

17:10:53  SQL status: SUCCESS 0 in 0.0 seconds
17:10:53  On list_None_testing: ROLLBACK
17:10:53  On list_None_testing: Close
17:10:53  Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'c587bbfa-b9bc-4b0c-9ce7-2451bb59989d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x109425c40>]}
17:10:53  Using starrocks connection "master"
17:10:53  On master: BEGIN
17:10:53  Opening a new connection, currently in state init
17:10:53  SQL status: SUCCESS 0 in 0.0 seconds
17:10:53  On master: COMMIT
17:10:53  Using starrocks connection "master"
17:10:53  On master: COMMIT
17:10:53  SQL status: SUCCESS 0 in 0.0 seconds
17:10:53  On master: Close
17:10:53  Concurrency: 1 threads (target='dev')
17:10:53
17:10:53  Began running node model.jaffle_shop.my_first_dbt_model
17:10:53  1 of 2 START sql table model testing.my_first_dbt_model ........................ [RUN]
17:10:53  Re-using an available connection from the pool (formerly list_None_testing, now model.jaffle_shop.my_first_dbt_model)
17:10:53  Began compiling node model.jaffle_shop.my_first_dbt_model
17:10:53  Writing injected SQL for node "model.jaffle_shop.my_first_dbt_model"
17:10:53  Timing info for model.jaffle_shop.my_first_dbt_model (compile): 10:10:53.865544 => 10:10:53.870432
17:10:53  Began executing node model.jaffle_shop.my_first_dbt_model
17:10:53  Opening a new connection, currently in state closed
17:10:53  Writing runtime sql for node "model.jaffle_shop.my_first_dbt_model"
17:10:53  Using starrocks connection "model.jaffle_shop.my_first_dbt_model"
17:10:53  On model.jaffle_shop.my_first_dbt_model: BEGIN
17:10:53  SQL status: SUCCESS 0 in 0.0 seconds
17:10:53  Using starrocks connection "model.jaffle_shop.my_first_dbt_model"
17:10:53  On model.jaffle_shop.my_first_dbt_model: /* {"app": "dbt", "dbt_version": "1.6.10", "profile_name": "jaffle_shop", "target_name": "dev", "node_id": "model.jaffle_shop.my_first_dbt_model"} */




  create table `testing`.`my_first_dbt_model__dbt_tmp`
    PROPERTIES (
      "replication_num" = "1"
    )
  as /*
    Welcome to your first dbt model!
    Did you know that you can also configure models directly within SQL files?
    This will override configurations stated in dbt_project.yml

    Try changing "table" to "view" below
*/



with source_data as (

    select 1 as id
    union all
    select null as id

)

select *
from source_data

/*
    Uncomment the line below to remove records with null `id` values
*/

-- where id is not null

17:10:53  starrocks adapter: StarRocks error: 1064 (HY000): Insert has filtered data in strict mode, txn_id = 4 tracking sql = select tracking_log from information_schema.load_tracking_logs where job_id=11119
17:10:53  On model.jaffle_shop.my_first_dbt_model: ROLLBACK
17:10:53  Timing info for model.jaffle_shop.my_first_dbt_model (execute): 10:10:53.870915 => 10:10:53.947047
17:10:53  On model.jaffle_shop.my_first_dbt_model: Close
17:10:53  Database Error in model my_first_dbt_model (models/example/my_first_dbt_model.sql)
  1064 (HY000): Insert has filtered data in strict mode, txn_id = 4 tracking sql = select tracking_log from information_schema.load_tracking_logs where job_id=11119
  compiled Code at target/run/jaffle_shop/models/example/my_first_dbt_model.sql
17:10:53  Sending event: {'category': 'dbt', 'action': 'run_model', 'label': 'c587bbfa-b9bc-4b0c-9ce7-2451bb59989d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10947b700>]}
17:10:53  1 of 2 ERROR creating sql table model testing.my_first_dbt_model ............... [ERROR in 0.09s]
17:10:53  Finished running node model.jaffle_shop.my_first_dbt_model
17:10:53  Began running node model.jaffle_shop.my_second_dbt_model
17:10:53  2 of 2 SKIP relation testing.my_second_dbt_model ............................... [SKIP]
17:10:53  Finished running node model.jaffle_shop.my_second_dbt_model
17:10:53  Using starrocks connection "master"
17:10:53  On master: BEGIN
17:10:53  Opening a new connection, currently in state closed
17:10:53  SQL status: SUCCESS 0 in 0.0 seconds
17:10:53  On master: COMMIT
17:10:53  Using starrocks connection "master"
17:10:53  On master: COMMIT
17:10:53  SQL status: SUCCESS 0 in 0.0 seconds
17:10:53  On master: Close
17:10:53  Connection 'master' was properly closed.
17:10:53  Connection 'model.jaffle_shop.my_first_dbt_model' was properly closed.
17:10:53
17:10:53  Finished running 1 table model, 1 view model in 0 hours 0 minutes and 0.20 seconds (0.20s).
17:10:53  Command end result
17:10:53
17:10:53  Completed with 1 error and 0 warnings:
17:10:53
17:10:53    Database Error in model my_first_dbt_model (models/example/my_first_dbt_model.sql)
  1064 (HY000): Insert has filtered data in strict mode, txn_id = 4 tracking sql = select tracking_log from information_schema.load_tracking_logs where job_id=11119
  compiled Code at target/run/jaffle_shop/models/example/my_first_dbt_model.sql
17:10:53
17:10:53  Done. PASS=0 WARN=0 ERROR=1 SKIP=1 TOTAL=2
17:10:53  Command `dbt run` failed at 10:10:53.986391 after 0.37 seconds
17:10:53  Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x103a53760>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1092bf250>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x109470310>]}
17:10:53  Flushing usage events
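The failure is likely strict mode filtering the intentional `null as id` row that the tutorial model inserts; uncommenting the model's own `where id is not null` line should let the run pass. The error message also points at a diagnostic query; a tiny helper to build it (the `job_id` is illustrative):

```python
def tracking_log_query(job_id):
    """Build the diagnostic query the error message points at; run it via any
    MySQL-protocol client connected to StarRocks to see why rows were filtered."""
    return (
        "select tracking_log from information_schema.load_tracking_logs "
        f"where job_id={int(job_id)}"
    )

print(tracking_log_query(11119))
```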

starrocks-adapter is not available

Environment used with this starrocks-adapter:

  • python-3.7.4
  • dbt-core-1.1.0
  • starrocks-2.1.4

The following problems occur:

  1. When creating a project (via `dbt init`), there is no starrocks adapter to choose from:
    1) Install dbt-starrocks (`pip install .`)
    2) Create a project (`dbt init`)
    3) There is no 'starrocks-adapter' option in the list

  2. sample_profiles.yml format error (indentation error):

    ```
    default:
    outputs:
    dev:
    type: starrocks
    host:
    port: <port_num>
    username:
    password:
    database:
    target: dev
    ```

  3. Executing `dbt run` fails with a connection error:

    ```
    Database Error
    (1064, "Unknown character set: 'latin1'")
    ```

  4. When running 'starrocks.dbtspec test_dbt_base: base', the following exception occurs:
    Expected only one database in get_catalog
    This happens because two 'information_schema' entries exist:
    1. {database}.information_schema
    2. information_schema
  5. Executing RENAME when creating a view, i.e. running:
    create view {{ to_relation }} as {{ results[0]['Create View'].replace(from_relation.table, to_relation.table).split('AS',1)[1] }} drop view if exists {{ from_relation }};
    fails with 'unknown error': two SQL statements cannot be executed in a single call.
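Since the driver rejects two statements in one call, a workaround sketch (names here are illustrative, not adapter code) is to split the SQL and execute each statement separately:

```python
def execute_statements(cursor, sql):
    """Execute each ';'-separated statement in its own call. The split is
    naive (it would break on ';' inside string literals), but it is enough
    for the create-view + drop-view pair above."""
    executed = []
    for stmt in sql.split(";"):
        stmt = stmt.strip()
        if stmt:
            cursor.execute(stmt)
            executed.append(stmt)
    return executed

class StubCursor:
    """Stand-in for a DB-API cursor, just to show the call sequence."""
    def __init__(self):
        self.calls = []
    def execute(self, stmt):
        self.calls.append(stmt)

cur = StubCursor()
execute_statements(cur, "create view v as select 1; drop view if exists old_v;")
print(cur.calls)
```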

upgrade to support dbt-core v1.5.0

Background

The latest version of dbt Core, dbt-core==1.5.0rc1, was published on April 13, 2023 (PyPI | GitHub).

How to upgrade

dbt-labs/dbt-core#7213 is an open discussion with more detailed information. If you have questions, please put them there!

The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.5.0 release of your adapter.

  • Add support for Python 3.11 (if you haven't already)
  • Add support for the relevant tests (there are a lot of new ones!)
  • Add support for model contracts
  • Add support for materialized views (this will likely be bumped to 1.6.0)

the next minor release: 1.6.0

FYI, dbt-core==1.6.0 is expected to be released at the end of July, with a release cut at least two weeks prior.

dbt tests expect that you can dynamically create a database on connection

Exception has occurred: DatabaseError
1064 (HY000): Unknown database 'test17102070719455983730_test_basic'
_mysql_connector.MySQLInterfaceError: Unknown database 'test17102070719455983730_test_basic'

The above exception was the direct cause of the following exception:

  File "/Users/atwong/sandbox/dbt-starrocks/dbt/adapters/starrocks/connections.py", line 113, in open
    connection.handle = mysql.connector.connect(**kwargs)
mysql.connector.errors.DatabaseError: 1064 (HY000): Unknown database 'test17102070719455983730_test_basic'
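One possible workaround (a sketch, not the adapter's current behavior) is for the connection logic to create the target schema before switching to it, rather than passing a possibly-nonexistent database to `mysql.connector.connect`. The statements such a hook could issue:

```python
def autocreate_statements(database):
    """Statements a connection hook could run so a test-created schema exists
    before dbt switches to it. Backtick quoting assumes MySQL-style
    identifiers, which the StarRocks protocol accepts."""
    return [
        f"CREATE DATABASE IF NOT EXISTS `{database}`",
        f"USE `{database}`",
    ]

for stmt in autocreate_statements("test17102070719455983730_test_basic"):
    print(stmt)
```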

StarRocks DBT connector to support schema evolution

Does our StarRocks DBT connector handle the following DBT schema evolution changes? dbt-labs/dbt-spark#124 (comment)

Handle schema evolution with and without partitioning.

* The full import, without partitioning. As @charlottevdscheun mentioned earlier in https://github.com/dbt-labs/dbt-spark/pull/117: currently we use INSERT INTO, but I would suggest replacing this with CREATE OR REPLACE TABLE, which lets us atomically update the table while also allowing schema changes. This keeps the full history of the table and is fully supported by Delta.
* In the case of partition by, forward-compatible schema evolution is allowed. We can add new fields to partitions, and they will just be null for the other partitions.

[dbt-starrocks] All macros fail with type error if profiles.yml "version" is not set.

Steps to reproduce the behavior (Required)

  1. Create a seed file;
  2. Make sure that no version is specified in your project's profiles.yml;
  3. run dbt seed.

Behavior (Required)

You get this error:

15:37:58    Compilation Error in seed my_seed (seeds/my_seed.csv)
  '>' not supported between instances of 'int' and 'str'
  
  > in macro starrocks__create_csv_table (macros/materializations/seeds.sql)
  > called by macro create_csv_table (macros/materializations/seeds/helpers.sql)
  > called by macro materialization_seed_default (macros/materializations/seeds/seed.sql)
  > called by seed my_seed (seeds/my_seed.csv)

I ran a debugger and it seems to fail here: https://github.com/StarRocks/starrocks/blob/5ca5ff0108d6a47f17d421ae37f66ebbec5775e4/contrib/dbt-connector/dbt/adapters/starrocks/impl.py#L149

… due to this line returning a string-tuple: https://github.com/StarRocks/starrocks/blob/5ca5ff0108d6a47f17d421ae37f66ebbec5775e4/contrib/dbt-connector/dbt/adapters/starrocks/connections.py#L92

… in turn happening when version is not explicitly defined, here:
https://github.com/StarRocks/starrocks/blob/5ca5ff0108d6a47f17d421ae37f66ebbec5775e4/contrib/dbt-connector/dbt/adapters/starrocks/connections.py#L136-L140
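Since the string-tuple only appears when `version` is not explicitly defined, a defensive fix sketch (the fallback value and function name here are hypothetical, not the adapter's actual default) would pin a default and always compare int tuples:

```python
import re

DEFAULT_VERSION = (3, 1, 0)  # hypothetical fallback, not the adapter's real default

def effective_version(raw):
    """Return an int tuple for any profile `version` value, falling back to a
    pinned default when it is unset, so macro comparisons never hit
    "'>' not supported between instances of 'int' and 'str'"."""
    if raw in (None, ""):
        return DEFAULT_VERSION
    parts = str(raw).split(".")
    # Keep only leading digits of each part (e.g. '2-269e832' -> 2).
    return tuple(int(re.match(r"\d*", p).group() or 0) for p in parts)

print(effective_version(None))            # (3, 1, 0)
print(effective_version("3.2.2-269e832"))  # (3, 2, 2)
```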

StarRocks version (Required)

3.2.2-269e832

mysql.connector.errors.InternalError: Unread result found

Invoking dbt with ['seed']
02:55:39  Running with dbt=1.6.2
02:55:39  Registered adapter: starrocks=1.4.2
02:55:39  Unable to do partial parsing because saved manifest not found. Starting full parse.
02:55:40  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
02:55:40  
02:55:40  Concurrency: 1 threads (target='default')
02:55:40  
02:55:40  1 of 1 START seed file test17102121381979843801_test_basic.base ................ [RUN]
02:55:41  1 of 1 OK loaded seed file test17102121381979843801_test_basic.base ............ [INSERT 10 in 0.44s]
02:55:41  
02:55:41  Finished running 1 seed in 0 hours 0 minutes and 0.56 seconds (0.56s).
02:55:41  
02:55:41  Completed successfully
02:55:41  
02:55:41  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1


Invoking dbt with ['run']
02:55:41  Running with dbt=1.6.2
02:55:41  Registered adapter: starrocks=1.4.2
02:55:41  Found 3 models, 1 seed, 1 source, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
02:55:41  
02:55:41  Concurrency: 1 threads (target='default')
02:55:41  
02:55:41  1 of 3 START sql table model test17102121381979843801_test_basic.swappable ..... [RUN]
02:55:41  1 of 3 OK created sql table model test17102121381979843801_test_basic.swappable  [SUCCESS 10 in 0.41s]
02:55:41  2 of 3 START sql table model test17102121381979843801_test_basic.table_model ... [RUN]
02:55:42  2 of 3 OK created sql table model test17102121381979843801_test_basic.table_model  [SUCCESS 10 in 0.27s]
02:55:42  3 of 3 START sql view model test17102121381979843801_test_basic.view_model ..... [RUN]
02:55:42  3 of 3 OK created sql view model test17102121381979843801_test_basic.view_model  [SUCCESS 0 in 0.11s]
02:55:42  
02:55:42  Finished running 2 table models, 1 view model in 0 hours 0 minutes and 0.88 seconds (0.88s).
02:55:42  
02:55:42  Completed successfully
02:55:42  
02:55:42  Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3
select count(*) as num_rows from `test17102121381979843801_test_basic`.`base`
Unread result found
F

=================================== FAILURES ===================================
________________ TestSimpleMaterializationsMyAdapter.test_base _________________

self = <test_basic.TestSimpleMaterializationsMyAdapter object at 0x108e611c0>
project = <dbt.tests.fixtures.project.TestProjInfo object at 0x1060fa9a0>

    def test_base(self, project):
    
        # seed command
        results = run_dbt(["seed"])
        # seed result length
        assert len(results) == 1
    
        # run command
        results = run_dbt()
        # run result length
        assert len(results) == 3
    
        # names exist in result nodes
        check_result_nodes_by_name(results, ["view_model", "table_model", "swappable"])
    
        # check relation types
        expected = {
            "base": "table",
            "view_model": "view",
            "table_model": "table",
            "swappable": "table",
        }
        check_relation_types(project.adapter, expected)
    
        # base table rowcount
        relation = relation_from_name(project.adapter, "base")
>       result = project.run_sql(f"select count(*) as num_rows from {relation}", fetch="one")

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/adapter/basic/test_base.py:66: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/fixtures/project.py:431: in run_sql
    return run_sql_with_adapter(self.adapter, sql, fetch=fetch)
/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/util.py:304: in run_sql_with_adapter
    return adapter.run_sql_for_tests(sql, fetch, conn)
/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/adapters/sql/impl.py:256: in run_sql_for_tests
    conn.handle.commit()
/Users/atwong/dbt-env/lib/python3.9/site-packages/mysql/connector/connection_cext.py:531: in commit
    self.handle_unread_result()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <mysql.connector.connection_cext.CMySQLConnection object at 0x10a4ccf40>
prepared = False

    def handle_unread_result(self, prepared: bool = False) -> None:
        """Check whether there is an unread result"""
        unread_result = self._unread_result if prepared is True else self.unread_result
        if self.can_consume_results:
            self.consume_results()
        elif unread_result:
>           raise InternalError("Unread result found")
E           mysql.connector.errors.InternalError: Unread result found

/Users/atwong/dbt-env/lib/python3.9/site-packages/mysql/connector/connection_cext.py:990: InternalError
=========================== short test summary info ============================
FAILED tests/functional/adapter/test_basic.py::TestSimpleMaterializationsMyAdapter::test_base
============================== 1 failed in 4.52s ===============================
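The traceback shows `commit()` failing because the connection still holds an unread result set. A minimal sketch of the fix pattern (`buffered=True` is a real mysql.connector cursor option that pre-fetches all rows; the helper name is illustrative, not dbt's `run_sql_for_tests`):

```python
def fetch_then_commit(conn, sql):
    """Drain the result set before committing, so the connection holds no
    unread result when `commit()` runs (the failure mode in the traceback)."""
    cur = conn.cursor(buffered=True)  # pre-fetches all rows on execute
    cur.execute(sql)
    rows = cur.fetchall()
    cur.close()
    conn.commit()
    return rows
```

Alternatively, `conn.consume_results()` (visible in the traceback's `handle_unread_result`) can be called before the commit.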

dbt tutorial my_first_dbt_model doesn't work with StarRocks

Stuck at step 5. https://docs.getdbt.com/guides/manual-install?step=5

(dbt-env) atwong@Albert-CelerData jaffle_shop % dbt run
23:59:46  Running with dbt=1.6.2
23:59:46  Registered adapter: starrocks=1.4.2
23:59:46  Unable to do partial parsing because profile has changed
23:59:46  Found 2 models, 4 tests, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
23:59:46
23:59:46  Concurrency: 1 threads (target='dev')
23:59:46
23:59:46  1 of 2 START sql table model testing.my_first_dbt_model ........................ [RUN]
23:59:46  1 of 2 ERROR creating sql table model testing.my_first_dbt_model ............... [ERROR in 0.05s]
23:59:46  2 of 2 SKIP relation testing.my_second_dbt_model ............................... [SKIP]
23:59:46
23:59:46  Finished running 1 table model, 1 view model in 0 hours 0 minutes and 0.18 seconds (0.18s).
23:59:46
23:59:46  Completed with 1 error and 0 warnings:
23:59:46
23:59:46    Compilation Error in model my_first_dbt_model (models/example/my_first_dbt_model.sql)
  '>' not supported between instances of 'int' and 'str'

  > in macro starrocks__create_table_as (macros/materializations/models/table.sql)
  > called by macro create_table_as (macros/materializations/models/table/create_table_as.sql)
  > called by macro default__get_create_table_as_sql (macros/materializations/models/table/create_table_as.sql)
  > called by macro get_create_table_as_sql (macros/materializations/models/table/create_table_as.sql)
  > called by macro statement (macros/etc/statement.sql)
  > called by macro materialization_table_default (macros/materializations/models/table/table.sql)
  > called by model my_first_dbt_model (models/example/my_first_dbt_model.sql)
23:59:46
23:59:46  Done. PASS=0 WARN=0 ERROR=1 SKIP=1 TOTAL=2

macro 'dbt_macro__snapshot_check_all_get_existing_columns' takes not more than 2 argument(s)

CLIENT: Server listening on port 63194...
Received JSON data in run script
Running pytest with args: ['-p', 'vscode_pytest', '--rootdir=/Users/atwong/sandbox/dbt-starrocks', '/Users/atwong/sandbox/dbt-starrocks/tests/functional/adapter/test_basic.py::TestSnapshotCheckColsMyAdapter::test_snapshot_check_cols']
============================= test session starts ==============================
platform darwin -- Python 3.9.16, pytest-8.1.1, pluggy-1.4.0
rootdir: /Users/atwong/sandbox/dbt-starrocks
configfile: pytest.ini
plugins: dotenv-0.5.2
collected 1 item

tests/functional/adapter/test_basic.py F                                 [100%]

=================================== FAILURES ===================================
___________ TestSnapshotCheckColsMyAdapter.test_snapshot_check_cols ____________

self = <test_basic.TestSnapshotCheckColsMyAdapter object at 0x10699d2e0>
project = <dbt.tests.fixtures.project.TestProjInfo object at 0x102dd6c10>

    def test_snapshot_check_cols(self, project):
        # seed command
        results = run_dbt(["seed"])
        assert len(results) == 2
    
        # snapshot command
>       results = run_dbt(["snapshot"])

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/adapter/basic/test_snapshot_check_cols.py:44: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = ['snapshot', '--project-dir', '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-94/project0', '--profiles-dir', '/private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-94/profile0']
expect_pass = True

    def run_dbt(
        args: Optional[List[str]] = None,
        expect_pass: bool = True,
    ):
        # Ignore logbook warnings
        warnings.filterwarnings("ignore", category=DeprecationWarning, module="logbook")
    
        # reset global vars
        reset_metadata_vars()
    
        # The logger will complain about already being initialized if
        # we don't do this.
        log_manager.reset_handlers()
        if args is None:
            args = ["run"]
    
        print("\n\nInvoking dbt with {}".format(args))
        from dbt.flags import get_flags
    
        flags = get_flags()
        project_dir = getattr(flags, "PROJECT_DIR", None)
        profiles_dir = getattr(flags, "PROFILES_DIR", None)
        if project_dir and "--project-dir" not in args:
            args.extend(["--project-dir", project_dir])
        if profiles_dir and "--profiles-dir" not in args:
            args.extend(["--profiles-dir", profiles_dir])
    
        dbt = dbtRunner()
        res = dbt.invoke(args)
    
        # the exception is immediately raised to be caught in tests
        # using a pattern like `with pytest.raises(SomeException):`
        if res.exception is not None:
            raise res.exception
    
        if expect_pass is not None:
>           assert res.success == expect_pass, "dbt exit state did not match expected"
E           AssertionError: dbt exit state did not match expected

/Users/atwong/dbt-env/lib/python3.9/site-packages/dbt/tests/util.py:108: AssertionError
---------------------------- Captured stdout setup -----------------------------

=== Test project_root: /private/var/folders/xn/_h8gfpfs3vv_m6gkxm8t0l380000gn/T/pytest-of-atwong/pytest-94/project0
----------------------------- Captured stdout call -----------------------------


Invoking dbt with ['seed']
22:52:25  Running with dbt=1.6.2
22:52:25  Registered adapter: starrocks=1.4.2
22:52:25  Unable to do partial parsing because saved manifest not found. Starting full parse.
22:52:25  Found 3 snapshots, 2 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
22:52:25  
22:52:25  Concurrency: 1 threads (target='default')
22:52:25  
22:52:25  1 of 2 START seed file test17102839453554581547_test_basic.added ............... [RUN]
22:52:25  1 of 2 OK loaded seed file test17102839453554581547_test_basic.added ........... [INSERT 20 in 0.27s]
22:52:25  2 of 2 START seed file test17102839453554581547_test_basic.base ................ [RUN]
22:52:26  2 of 2 OK loaded seed file test17102839453554581547_test_basic.base ............ [INSERT 10 in 0.20s]
22:52:26  
22:52:26  Finished running 2 seeds in 0 hours 0 minutes and 0.53 seconds (0.53s).
22:52:26  
22:52:26  Completed successfully
22:52:26  
22:52:26  Done. PASS=2 WARN=0 ERROR=0 SKIP=0 TOTAL=2


Invoking dbt with ['snapshot']
22:52:26  Running with dbt=1.6.2
22:52:26  Registered adapter: starrocks=1.4.2
22:52:26  Found 3 snapshots, 2 seeds, 0 sources, 0 exposures, 0 metrics, 335 macros, 0 groups, 0 semantic models
22:52:26  
22:52:26  Concurrency: 1 threads (target='default')
22:52:26  
22:52:26  1 of 3 START snapshot test17102839453554581547_test_basic.cc_all_snapshot ...... [RUN]
22:52:26  1 of 3 ERROR snapshotting test17102839453554581547_test_basic.cc_all_snapshot .. [ERROR in 0.05s]
22:52:26  2 of 3 START snapshot test17102839453554581547_test_basic.cc_date_snapshot ..... [RUN]
22:52:26  2 of 3 ERROR snapshotting test17102839453554581547_test_basic.cc_date_snapshot . [ERROR in 0.02s]
22:52:26  3 of 3 START snapshot test17102839453554581547_test_basic.cc_name_snapshot ..... [RUN]
22:52:26  3 of 3 ERROR snapshotting test17102839453554581547_test_basic.cc_name_snapshot . [ERROR in 0.02s]
22:52:26  
22:52:26  Finished running 3 snapshots in 0 hours 0 minutes and 0.12 seconds (0.12s).
22:52:26  
22:52:26  Completed with 3 errors and 0 warnings:
22:52:26  
22:52:26    Compilation Error in snapshot cc_all_snapshot (snapshots/cc_all_snapshot.sql)
  macro 'dbt_macro__snapshot_check_all_get_existing_columns' takes not more than 2 argument(s)
  
  > in macro snapshot_check_strategy (macros/materializations/snapshots/strategies.sql)
  > called by macro materialization_snapshot_starrocks (macros/materializations/snapshot/snapshot.sql)
  > called by snapshot cc_all_snapshot (snapshots/cc_all_snapshot.sql)
22:52:26  
22:52:26    Compilation Error in snapshot cc_date_snapshot (snapshots/cc_date_snapshot.sql)
  macro 'dbt_macro__snapshot_check_all_get_existing_columns' takes not more than 2 argument(s)
  
  > in macro snapshot_check_strategy (macros/materializations/snapshots/strategies.sql)
  > called by macro materialization_snapshot_starrocks (macros/materializations/snapshot/snapshot.sql)
  > called by snapshot cc_date_snapshot (snapshots/cc_date_snapshot.sql)
22:52:26  
22:52:26    Compilation Error in snapshot cc_name_snapshot (snapshots/cc_name_snapshot.sql)
  macro 'dbt_macro__snapshot_check_all_get_existing_columns' takes not more than 2 argument(s)
  
  > in macro snapshot_check_strategy (macros/materializations/snapshots/strategies.sql)
  > called by macro materialization_snapshot_starrocks (macros/materializations/snapshot/snapshot.sql)
  > called by snapshot cc_name_snapshot (snapshots/cc_name_snapshot.sql)
22:52:26  
22:52:26  Done. PASS=0 WARN=0 ERROR=3 SKIP=0 TOTAL=3
=========================== short test summary info ============================
FAILED tests/functional/adapter/test_basic.py::TestSnapshotCheckColsMyAdapter::test_snapshot_check_cols
============================== 1 failed in 1.60s ===============================
Finished running tests!
