aisdb's People

Contributors

dependabot[bot], gabrielspadon, jaykumarr, jinnkunn, matt24smith

aisdb's Issues

AISdb paper revision

The paper that introduces AISdb needs to be revised based on the co-authors' feedback to keep the arXiv version up to date. The comments were shared via email and as annotations in the PDF. Note that the paper may also need to be shortened, since some venues do not accept lengthy submissions.

Spire Weather Data

@Jinnkunn, after the meeting with Spire, you need to check how we can access the new information they are providing and how we can adapt AISdb to store it and return it to the user when they query trajectory data. I believe Matt prepared some tools for this in advance, so the task is mainly to check the current documentation and run tests locally before reflecting the changes on the master branch.

AISdb marinetraffic scraper not working

The scraper for marinetraffic.com data by MMSI codes is not working: it raises a timeout error even when I can see in the Gecko driver that the MMSI exists, with the DEBUG environment variable set to "1".

Steps to reproduce the behavior:

import os
from aisdb.webdata.marinetraffic import VesselInfo

os.environ["DEBUG"] = "1"  # enable debug output from the scraper
output_db_filepath = 'PATH TO SQLITE DB FILE'

mmsi = [
    372351000,
    373416000,
    477003800,
    477282400,
    477519900,
    477593700,
    477831200,
    477864000,
    477905000,
    563626000,
    565715000
]

scraper = VesselInfo(output_db_filepath)
scraper.vessel_info_callback(mmsi)
  • OS: Windows 11
  • Gecko driver
  • Using a pip virtual environment and the latest version of AISdb
  • Python 3.12.3

Support for s390x and ppc64le

AISdb does not currently build for the s390x and ppc64le architectures. This is due to a dependency chain: version 0.16.20 of 'ring' is required by 'rustls', which in turn is needed by 'tungstenite' for the receiver component. The latest 'ring' release, version 0.17.0, is expected to support both s390x and ppc64le. We plan to address this compatibility issue promptly once the updated dependencies are released.

Related issue in ring:

The most recent rustls release (0.21.8) incorporates ring 0.17.0, per its GitHub repository. However, there is a version mismatch: tungstenite-rs employs rustls 0.21.0, which depends on the outdated ring 0.16.20. This versioning conflict is the root cause of the missing s390x and ppc64le support in AISdb. The issue will be resolved once tungstenite-rs updates to a rustls version that uses ring 0.17.0.

GitBook documentation

@AISViz/maintainers, we need to migrate the documentation and Python API references currently on the ReadTheDocs platform to the GitBook platform. I hope this will make the documentation more straightforward and help us produce more content showing users how to use the tools we have developed and will develop in the future. The GitBook account and directory for the AISdb documentation have already been created, but all the links and references are broken, and we will probably need to fix them by hand. This tends to be a lot of work initially but should save us time later on.

Query Results Suggestion

I have a suggestion to improve the behavior of AISdb when a query returns no results. Currently, AISdb does not return an empty set the way an RDBMS would; instead, it raises an exception that users must handle with an extra try/except on their side. To make things more straightforward and intuitive, I propose that AISdb return an empty set when there are no results. That way, users would not have to write additional code for this scenario, and the behavior would be consistent with industry standards. The traceback below illustrates the current behavior; a workaround sketch follows it.

Traceback (most recent call last):

  File "/usr/local/lib/python3.9/dist-packages/IPython/core/interactiveshell.py", line 3553, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)

  File "<ipython-input-55-4ae9a9be14b0>", line 1, in <cell line: 1>
    for all_tracks in get_tracks(hours2batch=168):

  File "<ipython-input-54-7b2bb655fd9e>", line 36, in get_tracks
    yield list(tracks)

  File "/usr/local/lib/python3.9/dist-packages/aisdb/interp.py", line 29, in interp_time
    for track in tracks:

  File "/usr/local/lib/python3.9/dist-packages/aisdb/track_gen.py", line 361, in encode_greatcircledistance
    for track in tracks:

  File "/usr/local/lib/python3.9/dist-packages/aisdb/track_gen.py", line 154, in split_timedelta
    for track in tracks:

  File "/usr/local/lib/python3.9/dist-packages/aisdb/track_gen.py", line 129, in TrackGen
    for rows in rowgen:

  File "/usr/local/lib/python3.9/dist-packages/aisdb/database/dbqry.py", line 258, in gen_qry
    raise SyntaxError(f'no results for query!\n{qry}')

  File "<string>", line unknown
SyntaxError: no results for query!
  SELECT 
    d.mmsi, 
    d.time, 
    d.longitude,
    d.latitude,
    d.sog,
    d.cog
  FROM esrf_hydrophone_01.ais_201508_dynamic AS d
  WHERE
d.time >= 1438387200 AND
    d.time <= 1438992000
ORDER BY 1,2
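
Until the library changes, the exception can be absorbed at the generator boundary. The sketch below assumes the SyntaxError raised by gen_qry shown above; tracks_or_empty is a hypothetical helper, not part of the AISdb API:

def tracks_or_empty(rowgen):
    """Yield rows from an AISdb row generator, treating the
    'no results for query!' SyntaxError as an empty result set."""
    try:
        yield from rowgen
    except SyntaxError:
        return  # no results: yield nothing instead of raising

Wrapping qry.gen_qry(...) in tracks_or_empty(...) before passing it to TrackGen makes an empty query behave like an empty result set.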

Script for data importation

Please create a script that can import data from Spire into the Postgres database. The script should establish connections to the required services, such as Wasabi and Postgres, create the databases in the AISdb structure, and commit the results once the insertion is validated. If possible, the process should be automated as a background service on BigData 1, using crontab or another scheduler that can run without user intervention. If automated, detailed documentation should be provided so future maintainers know how to run and debug the service.
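
A minimal sketch of such a script, assuming psycopg2 for Postgres and boto3 for Wasabi's S3-compatible API; the endpoint, bucket, credentials, object key, and table layout below are placeholders rather than the actual deployment values:

import boto3
import psycopg2

# Wasabi speaks the S3 protocol; endpoint/bucket/keys are placeholders.
s3 = boto3.client("s3",
                  endpoint_url="https://s3.wasabisys.com",
                  aws_access_key_id="ACCESS_KEY",
                  aws_secret_access_key="SECRET_KEY")
s3.download_file("spire-bucket", "ais/2023-01-01.csv", "/tmp/spire.csv")

conn = psycopg2.connect(host="localhost", dbname="aisdb", user="postgres")
with conn, conn.cursor() as cur:
    # Placeholder schema; the real script should create the AISdb table layout.
    cur.execute("CREATE TABLE IF NOT EXISTS ais_dynamic "
                "(mmsi BIGINT, time BIGINT, longitude REAL, latitude REAL)")
    with open("/tmp/spire.csv") as f:
        cur.copy_expert("COPY ais_dynamic FROM STDIN WITH CSV HEADER", f)
# the 'with conn' block commits only if every step succeeds
conn.close()

For automation, a crontab entry along the lines of 0 3 * * * python /path/to/import_spire.py would run the script nightly; the exact path and schedule are up to the maintainers.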

HDD Checking

@Jinnkunn, since you have already received the HDDs from the office, it is crucial to check the contents of each one and share that information with us. It is also important to know their capacity and how much data each one holds. If these hard drives store any raw AIS data, we may need to examine it at some point to determine whether further information can be extracted from the AIS messages and added to our database. It would be helpful to know the attributes of these AIS messages beyond the ones we already have, as well as their prevalence over the years and by vessel type. This information can assist us in the long run by allowing us to write more analytical papers.

GitHub hooks and other integrations

@JayKumarr, as we migrated to GitHub, I need your help to set up all the small details of the repository related to automating AISdb's building, testing, and sharing. @Jinnkunn mentioned that Maturin integrates with GitHub, so we need to have it all set up before we make the repository publicly available.

GitHub Actions Workflow for AISdb-client

To enhance the development workflow and guarantee the consistent deployment of the most recent AISdb-client updates to PyPI, we suggest establishing a GitHub Actions Workflow. This automation will facilitate the build and release process, seamlessly publishing the package to PyPI with each new commit to the main branch.

Interpolation

@JayKumarr and @Jinnkunn, whenever you can, could you please add two other interpolation algorithms? One is great-circle interpolation, and the other is polyline interpolation for better handling of curves. A reference sketch of the great-circle case follows. Thanks!
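
For reference, a minimal numpy sketch of the great-circle case, interpolating positions along the sphere with spherical linear interpolation; great_circle_interp is an illustrative helper, not the eventual AISdb API:

import numpy as np

def great_circle_interp(lat1, lon1, lat2, lon2, fractions):
    """Interpolate along the great circle between two lat/lon points
    (degrees); fractions are positions in [0, 1] along the arc."""
    def to_xyz(lat, lon):
        lat, lon = np.radians(lat), np.radians(lon)
        return np.array([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)])
    a, b = to_xyz(lat1, lon1), to_xyz(lat2, lon2)
    omega = np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))  # central angle
    points = []
    for f in fractions:  # slerp; assumes the two points are distinct
        v = (np.sin((1 - f) * omega) * a + np.sin(f * omega) * b) / np.sin(omega)
        points.append((np.degrees(np.arcsin(v[2])),          # latitude
                       np.degrees(np.arctan2(v[1], v[0]))))  # longitude
    return points

print(great_circle_interp(44.0, -63.0, 45.0, -60.0, [0.25, 0.5, 0.75]))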

Open Source Data Importation

@JayKumarr, as we are downsizing AISdb to only work with the data rather than keep offering outsourced services, it would be great to show people how to use AISdb with open-source data. The Spire data we have is not publicly sharable, but the data from https://coast.noaa.gov/digitalcoast/contributing-partners/marinecadastregov.html can be used freely. Could you work on a way to show people, either with a script or with Python code, how to ingest that data into AISdb databases for later use? A starting sketch is below.
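
As a starting point, this sketch downloads one day of MarineCadastre data and loads it with the same decode_msgs call used elsewhere in this tracker; the ZIP URL pattern is an assumption and should be verified against the NOAA site:

import urllib.request
import zipfile

import aisdb

# URL pattern is an assumption; confirm the layout on coast.noaa.gov.
url = "https://coast.noaa.gov/htdata/CMSP/AISDataHandler/2020/AIS_2020_01_01.zip"
urllib.request.urlretrieve(url, "AIS_2020_01_01.zip")
with zipfile.ZipFile("AIS_2020_01_01.zip") as z:
    z.extractall(".")

with aisdb.SQLiteDBConn("marinecadastre.db") as dbconn:
    aisdb.decode_msgs(filepaths=["AIS_2020_01_01.csv"],
                      dbconn=dbconn,
                      source="MarineCadastre",
                      verbose=True)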

Demo with Canadian MPAs

The AISViz project requires us to work with Marine Protected Areas (MPAs). I suggest we integrate pre-prepared datasets for each MPA near Atlantic Canada into the AISdb repository and make them public. The MPAs of interest are Banc-des-Américains, Basin Head, Eastport, Gilbert Bay, The Gully, Laurentian Channel, Musquash Estuary, and St. Anns Bank; more information about them is available at https://www.dfo-mpo.gc.ca/oceans/mpa-zpm/index-eng.html. To ensure the data is as comprehensive as possible, I suggest delineating a bounding box that extends 20 kilometers beyond the defined boundaries of each MPA; the official bounding boxes are at https://open.canada.ca/data/en/dataset/a1e18963-25dd-4219-a33f-1a38c4971250. The goal of this bounding box is to capture vessel traffic data from July 2022 to January 2023, a span of six months. By incorporating this additional data, people interested in AISdb can run examples near these protected areas. A sketch for buffering the boxes follows.
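
To make the 20-kilometer buffer concrete, here is a small sketch that pads a bounding box, using the rough conversion of about 111 km per degree of latitude; the coordinates in the example are illustrative placeholders, not the official MPA boundaries:

import math

def pad_bbox(lat_min, lat_max, lon_min, lon_max, km=20.0):
    """Expand a lat/lon bounding box by `km` on every side (approximate)."""
    dlat = km / 111.0                        # ~111 km per degree of latitude
    mid = math.radians((lat_min + lat_max) / 2)
    dlon = km / (111.0 * math.cos(mid))      # longitude degrees shrink with latitude
    return lat_min - dlat, lat_max + dlat, lon_min - dlon, lon_max + dlon

# Placeholder box; substitute the official boundaries from open.canada.ca.
print(pad_bbox(43.5, 44.2, -59.2, -58.6))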

Ship Type = 0 in SQLite while loading CSV

While decoding the CSV to create the SQLite database, I noticed the following. When we open the database, there are mainly five tables, among them one that contains vessel metadata. The ship type, draught, and IMO are available in the data but are not inserted into the database.

import aisdb

dbpath_ = 'D:/AIS_2020_01_01/data/AIS_2020_01_01_aisdb_sample.db'
with aisdb.SQLiteDBConn(dbpath_) as dbconn:
    aisdb.decode_msgs(filepaths=["D:/AIS_2020_01_01/data/AIS_2020_01_01_aisdb_sample.csv"],
                      dbconn=dbconn,
                      source='Testing',
                      verbose=True)

Missing Tiffs

The files distance-from-shore.tif and distance-from-port-v20201104.tif are referenced in the test-case code of the AISViz project, but they are not present on GitLab/GitHub.

The affected test cases exercise the marinetraffic-related code.

Package failing to build from source

I created a fresh venv and installed aisdb-client using these commands:

# installing the rust toolchain may be required
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# create a virtual python environment and install maturin
python -m venv env_ais
source ./env_ais/bin/activate
python -m pip install --upgrade maturin

# clone source and navigate to the package root
git clone https://github.com/AISViz/aisdb-client
cd aisdb-client

# install AISDB
maturin develop --release --extras=test,docs

The package fails to build with maturin for some reason.

Break Down the Large Export Task into Manageable Sub-Jobs

Our current export process experiences significant inefficiencies, particularly when handling large datasets (e.g., exporting data spanning an entire year).

This inefficiency manifests as prolonged periods of inactivity (sleep times) throughout the export process, severely impacting overall throughput. A practical examination revealed that exporting a year's data in a single job is considerably less efficient than dividing it into twelve monthly exports; this observation came from the export task for Bill's data request.

This discrepancy suggests that splitting large export tasks into smaller, more manageable jobs could significantly enhance performance. Consequently, a review and potential modification of the export code may be necessary to optimize efficiency and reduce operational delays.
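
A sketch of the month-by-month driver; export_range here is a hypothetical stand-in for the actual export code:

from datetime import datetime

def export_range(start, end):
    # Hypothetical stand-in for the real export job.
    print(f"exporting {start:%Y-%m-%d} .. {end:%Y-%m-%d}")

# First day of each month in the target year, plus the following January.
starts = [datetime(2020, m, 1) for m in range(1, 13)] + [datetime(2021, 1, 1)]
for start, end in zip(starts, starts[1:]):
    export_range(start, end)  # twelve small jobs instead of one year-long job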

CSV file importing with Rust is not working properly

I noticed that, in some circumstances, there is a type-conversion issue when converting text to int/bigint. I am currently investigating the origin of the issue, as previous fixes have not resolved it.

Integration with TensorFlow and PyTorch

We should integrate AISdb with TensorFlow and PyTorch by creating a custom, high-speed data loader. The goal is to simplify the process of setting up a database, importing the data loader, and training sequence-to-sequence models with the provided data. By integrating AISdb with TensorFlow and PyTorch, users will be able to directly utilize the machine-learning libraries within the AISdb environment. This will greatly facilitate the development and training of models, particularly those that require sequence-to-sequence learning. To develop the custom data loader, I suggest exploring the data-loader functionalities already provided by TensorFlow (https://www.tensorflow.org/api_docs/python/tf/data/Dataset) and PyTorch (https://pytorch.org/docs/stable/data.html). These libraries already offer robust and efficient data loading and preprocessing capabilities that can be easily tailored to general needs and specifications. This can be solved either by embedding these data-loading functions into AISdb using wrappers, or it can be done through tutorials on GitBook that will show the users how to join both libraries in an efficient and correct way.
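
As a first cut, here is a hedged sketch of a PyTorch Dataset that windows track arrays into fixed-length sequences for sequence-to-sequence training; the input format (dicts with equal-length 'lon'/'lat' arrays, as yielded by AISdb track generators) is an assumption:

import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class TrackWindows(Dataset):
    """Fixed-length (lon, lat) windows cut from variable-length tracks."""

    def __init__(self, tracks, seq_len=32):
        self.windows = []
        for t in tracks:  # assumed: dicts with 'lon' and 'lat' arrays
            xy = np.stack([t["lon"], t["lat"]], axis=1).astype(np.float32)
            for i in range(0, len(xy) - seq_len, seq_len):
                self.windows.append(xy[i:i + seq_len])

    def __len__(self):
        return len(self.windows)

    def __getitem__(self, idx):
        w = torch.from_numpy(self.windows[idx])
        return w[:-1], w[1:]  # inputs and next-step targets

# Example with a synthetic track:
tracks = [{"lon": np.linspace(-64, -63, 100), "lat": np.linspace(44, 45, 100)}]
loader = DataLoader(TrackWindows(tracks), batch_size=4, shuffle=True)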

Error when pip install aisdb

OS: macOS 14.5

pip install aisdb
Collecting aisdb
Using cached aisdb-1.7.2.tar.gz (313 kB)
Installing build dependencies ... error
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [160 lines of output]
Collecting maturin>=1.0
Using cached maturin-1.6.0-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl.metadata (18 kB)
Collecting numpy
Using cached numpy-1.26.4-cp39-cp39-macosx_11_0_arm64.whl.metadata (61 kB)
Collecting wheel
Using cached wheel-0.43.0-py3-none-any.whl.metadata (2.2 kB)
Collecting patchelf
Using cached patchelf-0.17.2.1.tar.gz (167 kB)
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Collecting tomli>=1.1.0 (from maturin>=1.0)
Using cached tomli-2.0.1-py3-none-any.whl.metadata (8.9 kB)
Using cached maturin-1.6.0-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl (15.8 MB)
Using cached numpy-1.26.4-cp39-cp39-macosx_11_0_arm64.whl (14.0 MB)
Using cached wheel-0.43.0-py3-none-any.whl (65 kB)
Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Building wheels for collected packages: patchelf
Building wheel for patchelf (pyproject.toml): started
Building wheel for patchelf (pyproject.toml): finished with status 'error'
error: subprocess-exited-with-error

    × Building wheel for patchelf (pyproject.toml) did not run successfully.
    │ exit code: 1
    ╰─> [126 lines of output]
  
  
        --------------------------------------------------------------------------------
        -- Trying 'Ninja' generator
        --------------------------------
        ---------------------------
        ----------------------
        -----------------
        ------------
        -------
        --
        CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
          Compatibility with CMake < 3.5 will be removed from a future version of
          CMake.
  
          Update the VERSION argument <min> value or use a ...<max> suffix to tell
          CMake that the project does not need compatibility with older versions.
  
        Not searching for unused variables given on the command line.
  
        CMake Error: CMake was unable to find a build program corresponding to "Ninja".  CMAKE_MAKE_PROGRAM is not set.  You probably need to select a different build tool.
        -- Configuring incomplete, errors occurred!
        --
        -------
        ------------
        -----------------
        ----------------------
        ---------------------------
        --------------------------------
        -- Trying 'Ninja' generator - failure
        --------------------------------------------------------------------------------
  
  
  
        --------------------------------------------------------------------------------
        -- Trying 'Unix Makefiles' generator
        --------------------------------
        ---------------------------
        ----------------------
        -----------------
        ------------
        -------
        --
        CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
          Compatibility with CMake < 3.5 will be removed from a future version of
          CMake.
  
          Update the VERSION argument <min> value or use a ...<max> suffix to tell
          CMake that the project does not need compatibility with older versions.
  
        Not searching for unused variables given on the command line.
  
        -- The C compiler identification is AppleClang 15.0.0.15000309
        -- Detecting C compiler ABI info
        -- Detecting C compiler ABI info - done
        -- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/cc - skipped
        -- Detecting C compile features
        -- Detecting C compile features - done
        -- The CXX compiler identification is AppleClang 15.0.0.15000309
        -- Detecting CXX compiler ABI info
        -- Detecting CXX compiler ABI info - done
        -- Check for working CXX compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/c++ - skipped
        -- Detecting CXX compile features
        -- Detecting CXX compile features - done
        -- Configuring done (0.3s)
        -- Generating done (0.0s)
        -- Build files have been written to: /private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-install-88ufb1qj/patchelf_426e789d7c4e4a67b6ce440c91e7d5a7/_cmake_test_compile/build
        --
        -------
        ------------
        -----------------
        ----------------------
        ---------------------------
        --------------------------------
        -- Trying 'Unix Makefiles' generator - success
        --------------------------------------------------------------------------------
  
        Configuring Project
          Working directory:
            /private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-install-88ufb1qj/patchelf_426e789d7c4e4a67b6ce440c91e7d5a7/_skbuild/macosx-14.0-arm64-3.9/cmake-build
          Command:
            /opt/homebrew/bin/cmake /private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-install-88ufb1qj/patchelf_426e789d7c4e4a67b6ce440c91e7d5a7 -G 'Unix Makefiles' --no-warn-unused-cli -DCMAKE_INSTALL_PREFIX:PATH=/private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-install-88ufb1qj/patchelf_426e789d7c4e4a67b6ce440c91e7d5a7/_skbuild/macosx-14.0-arm64-3.9/cmake-install -DPYTHON_VERSION_STRING:STRING=3.9.6 -DSKBUILD:INTERNAL=TRUE -DCMAKE_MODULE_PATH:PATH=/private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-build-env-e51ae7bo/overlay/lib/python3.9/site-packages/skbuild/resources/cmake -DPYTHON_EXECUTABLE:PATH=/Users/myname/Desktop/aisdb_demo/venv/bin/python -DPYTHON_INCLUDE_DIR:PATH=/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/Headers -DPYTHON_LIBRARY:PATH=/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/lib/libpython3.9.dylib -DPython_EXECUTABLE:PATH=/Users/myname/Desktop/aisdb_demo/venv/bin/python -DPython_ROOT_DIR:PATH=/Users/myname/Desktop/aisdb_demo/venv -DPython_FIND_REGISTRY:STRING=NEVER -DPython_INCLUDE_DIR:PATH=/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/Headers -DPython3_EXECUTABLE:PATH=/Users/myname/Desktop/aisdb_demo/venv/bin/python -DPython3_ROOT_DIR:PATH=/Users/myname/Desktop/aisdb_demo/venv -DPython3_FIND_REGISTRY:STRING=NEVER -DPython3_INCLUDE_DIR:PATH=/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/Headers -DCMAKE_OSX_DEPLOYMENT_TARGET:STRING=14.0 -DCMAKE_OSX_ARCHITECTURES:STRING=arm64 -DCMAKE_BUILD_TYPE:STRING=Release
  
        Not searching for unused variables given on the command line.
        -- The C compiler identification is AppleClang 15.0.0.15000309
        -- The CXX compiler identification is AppleClang 15.0.0.15000309
        -- Detecting C compiler ABI info
        -- Detecting C compiler ABI info - done
        -- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/cc - skipped
        -- Detecting C compile features
        -- Detecting C compile features - done
        -- Detecting CXX compiler ABI info
        -- Detecting CXX compiler ABI info - done
        -- Check for working CXX compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/c++ - skipped
        -- Detecting CXX compile features
        -- Detecting CXX compile features - done
        -- Configuring done (0.3s)
        -- Generating done (0.0s)
        -- Build files have been written to: /private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-install-88ufb1qj/patchelf_426e789d7c4e4a67b6ce440c91e7d5a7/_skbuild/macosx-14.0-arm64-3.9/cmake-build
        [ 11%] Creating directories for 'build_patchelf'
        [ 22%] No download step for 'build_patchelf'
        [ 33%] No update step for 'build_patchelf'
        [ 44%] Performing patch step for 'build_patchelf'
        ./bootstrap.sh: line 2: autoreconf: command not found
        make[2]: *** [build_patchelf-prefix/src/build_patchelf-stamp/build_patchelf-patch] Error 127
        make[1]: *** [CMakeFiles/build_patchelf.dir/all] Error 2
        make: *** [all] Error 2
        Traceback (most recent call last):
          File "/private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-build-env-e51ae7bo/overlay/lib/python3.9/site-packages/skbuild/setuptools_wrap.py", line 674, in setup
            cmkr.make(make_args, install_target=cmake_install_target, env=env)
          File "/private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-build-env-e51ae7bo/overlay/lib/python3.9/site-packages/skbuild/cmaker.py", line 697, in make
            self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
          File "/private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-build-env-e51ae7bo/overlay/lib/python3.9/site-packages/skbuild/cmaker.py", line 742, in make_impl
            raise SKBuildError(msg)
  
        An error occurred while building with CMake.
          Command:
            /opt/homebrew/bin/cmake --build . --target install --config Release --
          Install target:
            install
          Source directory:
            /private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-install-88ufb1qj/patchelf_426e789d7c4e4a67b6ce440c91e7d5a7
          Working directory:
            /private/var/folders/zh/thckt_q533l3r5vvgrrcfbv80000gn/T/pip-install-88ufb1qj/patchelf_426e789d7c4e4a67b6ce440c91e7d5a7/_skbuild/macosx-14.0-arm64-3.9/cmake-build
        Please check the install target is valid and see CMake's output for more information.
  
        [end of output]
  
    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for patchelf
  Failed to build patchelf
  ERROR: Could not build wheels for patchelf, which is required to install pyproject.toml-based projects
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Error shown while loading any file, but the database is successfully created

By following the new tutorial [https://app.gitbook.com/o/EkFPgNRPUe5HDRcI8Zdo/s/hw3TIQxBDnQv4qlOYB74/working-with-your-ais-data] we can create a database; however, the console shows 'error in processing'. Even when importing the default test_data.csv from aisdb > testdata, the error appears, although the database is still created successfully.

Integration with PTRAIL for feature extraction

AISdb is a system used to store and query sequential spatio-temporal data, specifically trajectory data. However, to perform statistical and machine learning tasks, we need additional information that AIS alone does not provide. I strongly believe that integrating PTRAIL (https://github.com/YakshHaranwala/PTRAIL) with AISdb, through either a code integration or a new GitBook tutorial, would greatly benefit users. PTRAIL is a Python package that extracts features from trajectory-like data, and with its help AIS data can be used effectively for machine learning and deep learning tasks. The package extracts a variety of features out of the box and makes modeling and learning tasks easier to perform; it also works with any type of trajectory data, further increasing its usefulness. PTRAIL was developed at Memorial University and published in the SoftwareX journal, and its repository includes a number of examples that we would benefit from adapting to AIS data: https://github.com/YakshHaranwala/PTRAIL/tree/main/examples.

Memory Issue for data Extraction

@Jinnkunn, I modified your script to extract monthly data from PostgreSQL and store it in SQLite. So far, I have been able to extract data up to April 2020, but the script fails for the following months due to insufficient memory. I wonder whether there are ways to make the script more efficient or to use the disk more effectively to avoid memory issues. Can you please review the script and suggest changes so that I can proceed with the data extraction required for Alexandra? There is a related open issue that I am currently addressing.

I suggest committing data insertions every 24 hours or weekly, deleting unnecessary variables during the pipeline, and explicitly calling Python's garbage collector; a sketch follows the notes below. Please keep me updated on the progress.

  • Go to /meridian/sqlite-databases/; there you will find the .py file for data extraction.
  • If you need to use AISdb, you can run "source AISdb/*/activate" in the same folder.
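
A minimal sketch of the chunked pattern; fetch_day and insert_day are hypothetical helpers standing in for the actual PostgreSQL query and SQLite insert:

import gc
from datetime import datetime, timedelta

def fetch_day(day):
    return []        # hypothetical: query one day of rows from PostgreSQL

def insert_day(rows):
    pass             # hypothetical: insert into SQLite and commit

day = datetime(2020, 5, 1)
while day < datetime(2020, 6, 1):
    rows = fetch_day(day)
    insert_day(rows)           # commit once per day, not once per month
    del rows                   # drop the reference before the next chunk
    gc.collect()               # force collection to cap peak memory
    day += timedelta(days=1)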

Documentation Tutor Chatbot for AISdb

I suggest developing and integrating a chatbot trained with LangChain and an open-source large language model (LLM). The chatbot would act as a tutor for the AISdb documentation, making it easier for new users to onboard and providing ongoing support as they navigate the various functionalities of AISdb. It would be designed to provide code examples, assist in debugging, and offer guidance based on the user's needs. The effectiveness of this tool will depend on the quality and completeness of the existing documentation; however, given the wealth of open-source tools and pre-existing code available, the potential benefits of this initiative are substantial. To develop the chatbot, I suggest exploring models available on Hugging Face's Model Hub, which have been trained on a variety of data and could provide a solid foundation.
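
As a proof of concept before committing to LangChain, an extractive question-answering pipeline from Hugging Face transformers can be pointed at a documentation page; the model name is real, but the file path and question are illustrative:

from transformers import pipeline

# Extractive QA as a placeholder for a full retrieval-augmented tutor.
qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

context = open("docs/working-with-your-ais-data.md").read()  # any doc page
answer = qa(question="How do I create an SQLite database from CSV?",
            context=context)
print(answer["answer"], f"(score={answer['score']:.2f})")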
