
zlib's Introduction


Compression and decompression in the gzip and zlib format

This package provides a pure interface for compressing and decompressing streams of data represented as lazy ByteStrings. It uses the zlib C library so it has high performance. It supports the zlib, gzip and raw compression formats.

It provides a convenient high-level API suitable for most tasks; for the few cases where more control is needed, it provides access to the full zlib feature set.
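For example, the high-level API amounts to pure functions over lazy ByteStrings. A minimal round trip might look like this (shown for the gzip format; Codec.Compression.Zlib offers the same interface for the zlib format):

```haskell
{-# LANGUAGE OverloadedStrings #-}
import qualified Data.ByteString.Lazy as BL
import Codec.Compression.GZip (compress, decompress)

main :: IO ()
main = do
  let original = "Hello, zlib!" :: BL.ByteString
  -- compress and decompress are pure functions on lazy ByteStrings,
  -- so the round trip is an ordinary equality check.
  print (decompress (compress original) == original)
```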

zlib's People

Contributors

amesgen, andreasabel, angerman, bodigrim, dcoutts, donsbot, emilypi, facundominguez, hasufell, hsenag, hvr, kleidukos, kolmodin, ljli, minoru, ndmitchell, phadej, vmchale


zlib's Issues

Add support for flushing compression streams

At the API level this can be done by e.g. providing an additional field in CompressInputRequired
(this is what e.g. https://github.com/hvr/lzma does):

diff --git a/Codec/Compression/Zlib/Internal.hs b/Codec/Compression/Zlib/Internal.hs
index 74519c7..d5b851b 100644
--- a/Codec/Compression/Zlib/Internal.hs
+++ b/Codec/Compression/Zlib/Internal.hs
@@ -371,6 +371,7 @@ foldDecompressStreamWithInput chunk end err = \s lbs ->
 --
 data CompressStream m =
      CompressInputRequired {
+         compressFlush       :: m (CompressStream m),
          compressSupplyInput :: S.ByteString -> m (CompressStream m)
        }

which then results in a Z_SYNC_FLUSH being requested from zlib, with the CompressOutputAvailable state being returned until all pending input has been flushed out as compressed data.

Quoting the zlib documentation on deflate(_, Z_SYNC_FLUSH):

If the parameter flush is set to Z_SYNC_FLUSH, all pending output is flushed to the output buffer and the output is aligned on a byte boundary, so that the decompressor can get all input data available so far. (In particular avail_in is zero after the call if enough output space has been provided before the call.) Flushing may degrade compression for some compression algorithms and so it should be used only when necessary. This completes the current deflate block and follows it with an empty stored block that is three bits plus filler bits to the next byte, followed by four bytes (00 00 ff ff).

This would allow libraries such as io-streams to migrate from zlib-bindings to zlib.
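For illustration only, a driver for the proposed API might look like the sketch below. It assumes the hypothetical compressFlush field from the diff above (plus the existing CompressOutputAvailable constructor), so it will not compile against any released version of zlib:

```haskell
import qualified Data.ByteString as S
import Codec.Compression.Zlib.Internal

-- Hypothetical sketch: 'compressFlush' is the field proposed in this
-- issue; it is not part of any released zlib API.
flushPending :: Monad m => CompressStream m -> m ([S.ByteString], CompressStream m)
flushPending (CompressInputRequired flush _supplyInput) = flush >>= go []
  where
    -- Drain every CompressOutputAvailable state produced by the flush.
    go acc (CompressOutputAvailable out next) = next >>= go (out : acc)
    go acc s                                  = return (reverse acc, s)
flushPending s = return ([], s)
```

A streaming consumer (such as io-streams) could call flushPending at message boundaries to obtain a byte-aligned compressed prefix without ending the stream.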

Haskell library fails tests against C zlib-1.2.11, including identity test: Exception: 'user error (Codec.Compression.Zlib: stream error)'

Original report: https://bugs.gentoo.org/show_bug.cgi?id=613532
I can reproduce it on my machine as well thus can aid in debugging.

$ ./setup test --show-details=streaming
Running 1 test suites...
Test suite tests: RUNNING...
zlib tests
  property tests
    decompress . compress = id (standard):            FAIL (0.29s)
      *** Failed! Exception: 'user error (Codec.Compression.Zlib: stream error)' (after 1 test): 
      GZip
      CompressParams {compressLevel = CompressionLevel 4, compressMethod = Deflated, compressWindowBits = WindowBits 8, compressMemoryLevel = MemoryLevel 5, compressStrategy = DefaultStrategy, compressBufferSize = 121850, compressDictionary = Nothing}
      DecompressParams {decompressWindowBits = WindowBits 12, decompressBufferSize = 44823, decompressDictionary = Nothing, decompressAllMembers = False}
      ""
      Use --quickcheck-replay '0 TFGenR 000013DCC9212C1D000000001DCD6500000000000000E1EA00000003B9ACA000 0 1 1 0' to reproduce.

Build failure on GHC 8.x (macOS 10.14): ld: unknown option: -no_fixup_chains

I am getting build errors on macOS 10.14 which I have not seen before. They happen with any GHC 8 version and disappear with GHC 9:

$ cabal build -w ghc-8.10.7
...
Preprocessing library for zlib-0.6.3.0..
linking /Users/abel/bin/src/zlib/dist-newstyle/build/x86_64-osx/ghc-8.10.7/zlib-0.6.3.0/build/Codec/Compression/Zlib/Stream_hsc_make.o failed (exit code 1)
rsp file was: "/Users/abel/bin/src/zlib/dist-newstyle/build/x86_64-osx/ghc-8.10.7/zlib-0.6.3.0/build/Codec/Compression/Zlib/hsc2hscall34067-2.rsp"
command was: /usr/bin/gcc /Users/abel/bin/src/zlib/dist-newstyle/build/x86_64-osx/ghc-8.10.7/zlib-0.6.3.0/build/Codec/Compression/Zlib/Stream_hsc_make.o /Users/abel/bin/src/zlib/dist-newstyle/build/x86_64-osx/ghc-8.10.7/zlib-0.6.3.0/build/Codec/Compression/Zlib/Stream_hsc_utils.o -o /Users/abel/bin/src/zlib/dist-newstyle/build/x86_64-osx/ghc-8.10.7/zlib-0.6.3.0/build/Codec/Compression/Zlib/Stream_hsc_make \
  --target=x86_64-apple-darwin -Wl,-no_fixup_chains -L/usr/local/opt/icu4c/lib -L/usr/local/opt/libxml2/lib -lz -L/usr/local/lib/ghc-8.10.7/bytestring-0.10.12.0 -L/usr/local/lib/ghc-8.10.7/deepseq-1.4.4.0 -L/usr/local/lib/ghc-8.10.7/array-0.5.4.0 -L/usr/local/lib/ghc-8.10.7/base-4.14.3.0 -liconv -L/usr/local/lib/ghc-8.10.7/integer-gmp-1.0.3.0 -L/usr/local/lib/ghc-8.10.7/ghc-prim-0.6.1 -L/usr/local/lib/ghc-8.10.7/rts -lm -ldl
error: ld: unknown option: -no_fixup_chains
clang: error: linker command failed with exit code 1 (use -v to see invocation)

The latest macOS virtual environment does not have this problem: https://github.com/andreasabel/zlib/actions/runs/4194386615/jobs/7272477723#step:5:23
There is only a warning:

ld: warning: -undefined dynamic_lookup may not work with chained fixups

I am putting this up here, maybe someone has an idea how to work around this.

Segfault or corruption when combined with tar

Given the Hoogle tarball from http://hackage.haskell.org/packages/hoogle.tar.gz saved down, and the attached code, compiling with -threaded -with-rtsopts=-N4 on GHC 8.0.1 gives random data corruption. Usually it gives corrupt hashes followed by an invalid tar format error. Sometimes it gives segfaults. I have been unable to easily reproduce with either tar or zlib in isolation - my best guess is that zlib is producing buffers which aren't pinned enough, which are being collected early, and then tar chokes on them.

I observe this on other tarballs as well, but not index.tar.gz from Hackage. Hoogle just happens to be a publicly available tarball that shows the issue.

Test.zip

Release version of zlib for GHC 9.0

This issue states the obvious (a Hackage release of zlib is needed for use with GHC 9.0). I raised it to reference it from projects that depend on zlib.

For what it's worth, I submitted PR #41 with some of the changes to the cabal file needed for a new release.

Cannot find foreign library (Ubuntu, with zlib1g-dev installed)

I am attempting to build on Ubuntu but no matter how I try I can't seem to get zlib to find the native library. Both zlib1g and zlib1g-dev are installed.

nuttycom@dominion:~/oss/zlib (master)λ cabal new-build
Build profile: -w ghc-8.6.5 -O1
In order, the following will be built (use -v for more details):
 - zlib-0.6.2.2 (lib) (first run)
Configuring library for zlib-0.6.2.2..
cabal: Missing dependency on a foreign library:
* Missing (or bad) header file: zlib.h
* Missing (or bad) C library: z
This problem can usually be solved by installing the system package that
provides this library (you may need the "-dev" version). If the library is
already installed but in a non-standard location then you can use the flags
--extra-include-dirs= and --extra-lib-dirs= to specify where it is. If the
library file does exist, it may contain errors that are caught by the C
compiler at the preprocessing stage. In this case you can re-run configure
with the verbosity flag -v3 to see the error messages.
If the header file does exist, it may contain errors that are caught by the C
compiler at the preprocessing stage. In this case you can re-run configure
with the verbosity flag -v3 to see the error messages.

nuttycom@dominion:~/oss/zlib (master)λ sudo apt install zlib1g-dev
Reading package lists... Done
Building dependency tree
Reading state information... Done
zlib1g-dev is already the newest version (1:1.2.11.dfsg-2ubuntu1).

ghc.exe: could not execute: C:/GitLabRunner/builds/2WeHDSFP/0/ghc/ghc/inplace/mingw/bin/ld.exe

I need to build zlib on a Windows machine. Since GHC 8.8 isn't functional on Windows, I have to use GHC 8.10. Using stack with resolver: nightly-2020-09-19, I get the following error message:

❯ stack install zlib-0.6.2.2
zlib> configure
zlib> Configuring zlib-0.6.2.2...
zlib> build
zlib> Preprocessing library for zlib-0.6.2.2..
zlib> Building library for zlib-0.6.2.2..
zlib> [1 of 5] Compiling Codec.Compression.Zlib.Stream
zlib> ghc.exe: could not execute: C:/GitLabRunner/builds/2WeHDSFP/0/ghc/ghc/inplace/mingw/bin/ld.exe

--  While building package zlib-0.6.2.2 using:
      C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_3.2.0.0_ghc-8.10.2.exe --builddir=.stack-work\dist\a3a5fe88 build --ghc-options " -fdiagnostics-color=always"
    Process exited with code: ExitFailure 1

I really wonder what should be there in C:/GitLabRunner?

Missing dependency under CentOS

Adding zlib to a project's .cabal file leads to:

Configuring zlib-0.6.1.2...
Cabal-simple_mPHDZzAJ_1.24.2.0_ghc-8.0.2: Missing dependency on a foreign
library:
* Missing (or bad) header file: zlib.h
* Missing C library: z

OS: CentOS release 6.8

Drop GHC 6?

The cabal file contains branches for GHC 6:

zlib/zlib.cabal

Lines 87 to 91 in ee937e3

if impl(ghc < 7)
  default-language:   Haskell98
  default-extensions: PatternGuards
else
  default-language:   Haskell2010

Are these branches tested regularly, and if so, how?
Should we rather drop GHC 6 and stick to GHC >= 7, which is tested by CI?

haddock fails to build with ghc-8.0

This is low priority, but I still build some of my projects with ghc-8.0 on GitHub, for example pagure-hs. I still need to find out why it failed.

Error: cabal: Failed to build documentation for zlib-0.7.1.0 (which is required by pagure-0.1.2).

To be clear, this is certainly not a request to stop supporting 8.0: building the docs for 8.0 is very much a corner case by now, I would say, but if this is easily fixable it might be nice to address in a future release.

GHC 8.10.1 support

I have verified that zlib compiles with GHC 8.10.1. Please bump the base bound to <4.15 and release a new version or revision on Hackage.

Potential issue building on Windows

Hello,

Recently, while building cabal-install on Windows, I ran into an issue that had to do with upgrading from 0.6.3.0 to 0.7.0.0.

The saga is here: haskell/cabal#9775

To begin, pkg-config isn't available in the Windows environment used in that CI. Since 0.7.0.0 introduced a default dependency on pkg-config, I updated the Cabal project to use constraints: zlib -pkg-config. No real harm done there.

But then, I got a strange error, possibly when linking cabal-install:

Configuring executable 'cabal' for cabal-install-3.11.0.0..
Preprocessing executable 'cabal' for cabal-install-3.11.0.0..
Building executable 'cabal' for cabal-install-3.11.0.0..
[1 of 1] Compiling Main             ( main\Main.hs, C:\\GitLabRunner\builds\0\1797238\dist-newstyle\build\x86_64-windows\ghc-9.2.3\cabal-install-3.11.0.0\x\cabal\build\cabal\cabal-tmp\Main.o )
Linking C:\\GitLabRunner\\builds\\0\\1797238\\dist-newstyle\\build\\x86_64-windows\\ghc-9.2.3\\cabal-install-3.11.0.0\\x\\cabal\\build\\cabal\\cabal.exe ...
Cleaning up project directory and file based variables
ERROR: Job failed: exit status 127

When I looked into the matter, it looks like 0.6.3.0 uses the bundled C zlib by default on Windows, and 0.7.0.0 did not. By adding constraints: zlib +bundled-c-zlib on the Windows build, I was able to get a successful CI pipeline.

Now, this could simply be an issue with the CI environment, or maybe a bug with GHC (or Cabal) when trying to use the zlib that is bundled with GHC. I'm opening this just for visibility. I won't be looking into this much more myself at the present.

Decompressing concatenated file silently loses information

I discovered this when researching a user bug report at snoyberg/conduit#254. Short description:

{-# LANGUAGE OverloadedStrings #-}
import qualified Data.ByteString.Lazy as L
import Codec.Compression.GZip

main :: IO ()
main = do
    let lbs = L.concat
            [ compress "hello\n"
            , compress "world\n"
            ]
    print $ decompress lbs

This produces the output "hello\n", whereas the expected behavior would be either "hello\nworld\n" or throwing an exception (though the latter would be suboptimal).

In the conduit issue, I argued against changing the behavior of conduit-extra's ungzip function to avoid breaking backwards compatibility. However, the arguments do not apply in the case of the zlib package:

  • The decompress function silently loses data right now, with no way of retrieving that data
  • The conduit-extra ungzip function leaves the unconsumed data available on the stream, which has real-world use cases where it's useful
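For completeness: later zlib versions (0.6.1 and up) expose a decompressAllMembers flag in DecompressParams. Assuming such a version, the example above can be made to decode both members explicitly:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import qualified Data.ByteString.Lazy as L
import Codec.Compression.GZip

main :: IO ()
main = do
    let lbs = L.concat
            [ compress "hello\n"
            , compress "world\n"
            ]
    -- decompressAllMembers (zlib >= 0.6.1) decodes every gzip member
    -- in the stream instead of stopping after the first one.
    print $ decompressWith defaultDecompressParams { decompressAllMembers = True } lbs
```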

Reconsider default value of Cabal flag `pkg-config` on Windows

zlib-0.7's zlib.cabal has (extracts):

flag pkg-config
  default:     True
  manual:      False
  description: Use @pkg-config(1)@ to locate foreign @zlib@ library.
...
if flag(bundled-c-zlib) || impl(ghcjs) || os(ghcjs) || arch(wasm32)
    ...
  else
    if flag(pkg-config)
      -- NB: pkg-config is available on windows as well when using msys2
      pkgconfig-depends: zlib
    else
      -- On Windows zlib is shipped with GHC starting from 7.10
      extra-libraries: z

However, pkg-config.exe is not available on Windows with MSYS2 'out of the box' - although it can be installed separately with pacman.

For people using Stack to build (which promises reproducible builds) on Windows, the choice of default on that operating system may be problematic.

Given that, I ask whether it would be more appropriate for the Cabal file to specify - for Windows only - a default value of false.

Upstream error when building with GHC 8.10.2, on Windows 10

I was building in a current clone of the repository with stack --resolver nightly --compiler ghc-8.10.2 build, on Windows 10 version 2004, and getting this output with an error. If it is upstream of zlib, I have not experienced it building other packages:

Selected resolver: nightly-2020-08-12
zlib> configure (lib)
Configuring zlib-0.6.2.2...
zlib> build (lib)
Preprocessing library for zlib-0.6.2.2..
Building library for zlib-0.6.2.2..
[1 of 5] Compiling Codec.Compression.Zlib.Stream
ghc.exe: could not execute: C:/GitLabRunner/builds/2WeHDSFP/0/ghc/ghc/inplace/mingw/bin/ld.exe

--  While building package zlib-0.6.2.2 (scroll up to its section to see the error) using:
      C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_3.2.0.0_ghc-8.10.2.exe --builddir=.stack-work\dist\a3a5fe88 build lib:zlib --ghc-options " -fdiagnostics-color=always"
    Process exited with code: ExitFailure 1

The reference to C:/GitLabRunner/builds/2WeHDSFP/0/ghc/ghc/inplace/mingw/bin/ld.exe was very odd. There is no such folder. stack --resolver nightly --compiler ghc-8.10.2 exec -- where.exe ld.exe yields on my system:

Selected resolver: nightly-2020-08-12
C:\Users\mikep\AppData\Local\Programs\stack\x86_64-windows\ghc-8.10.2\mingw\bin\ld.exe
C:\msys64\mingw64\bin\ld.exe
C:\msys64\usr\bin\ld.exe

Verbose stack output adds no new information - the error log message comes immediately after the info [1 of 5] Compiling Codec.Compression.Zlib.Stream.

I then successfully built with GHC 8.10.1, with stack --resolver nightly --compiler ghc-8.10.1 build (see #27). Now, the GHC 8.10.2 build works, with:

stack --resolver nightly --compiler ghc-8.10.2 build
Selected resolver: nightly-2020-08-12
zlib> configure (lib)
Configuring zlib-0.6.2.2...
zlib> build (lib)
Preprocessing library for zlib-0.6.2.2..
Building library for zlib-0.6.2.2..
[2 of 5] Compiling Codec.Compression.Zlib.Internal
[3 of 5] Compiling Codec.Compression.Zlib.Raw
[4 of 5] Compiling Codec.Compression.Zlib
[5 of 5] Compiling Codec.Compression.GZip
zlib> copy/register
Installing library in C:\Users\mikep\Documents\Code\GitHub\zlib\.stack-work\install\ba0f85f5\lib\x86_64-windows-ghc-8.10.2\zlib-0.6.2.2-Co25ty4KuOL66sPWxfmhlW
Registering library for zlib-0.6.2.2..

GHC 8.8 issue: fail is no longer a member of Monad.

Codec/Compression/Zlib/Stream.hsc:377:3: error:
    ‘fail’ is not a (visible) method of class ‘Monad’
    |
377 |   fail   = (finalise >>) . failZ
    |   ^^^^

This currently blocks things like compiling cabal-install with ghc 8.8 for use with ghc 8.8.

no non-internal way to catch exception from decompress?

I was surprised to discover (via a production crash) that Zlib.decompress throws its own exception, not an IOError. Also, it looks like the only place its exception is exported is Codec.Compression.Zlib.Internal, which implies I shouldn't be using it normally.

How about re-exporting the exception from Zlib and GZip, and then augmenting the documentation to say exactly what exception will be thrown? As a bonus it would be nice to have an example, because I initially thought ByteString.Lazy.toStrict should be enough, but you also need an Exception.evaluate in there.

I can make a pull request if you want.
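The workaround mentioned above can be sketched as follows. This assumes DecompressError (exported from Codec.Compression.Zlib.Internal) has an Exception instance, and uses evaluate to force the whole lazy result so the error surfaces inside the try rather than later, when the ByteString is consumed:

```haskell
import Control.Exception (evaluate, try)
import qualified Data.ByteString.Lazy as BL
import Codec.Compression.Zlib (decompress)
import Codec.Compression.Zlib.Internal (DecompressError)

-- Sketch: force the full decompressed output so any DecompressError
-- is thrown here, where 'try' can catch it.
safeDecompress :: BL.ByteString -> IO (Either DecompressError BL.ByteString)
safeDecompress lbs = try $ do
  let out = decompress lbs
  _ <- evaluate (BL.length out)  -- demand every chunk of the lazy result
  return out
```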

RLE is useful

098e174

comments out several useful compression modes. I'm going to convert to streaming-commons, which has them, to get around this, but I wanted to note the deficiency.

[s390x] testsuite fails on Fedora 36

Not sure what changed but the zlib-0.6.2.3 testsuite started failing on Fedora Linux 36 s390x:

  unit tests
    simple gzip case:                                 OK
    detect bad crc:                                   OK
    detect non-gzip:                                  FAIL
      test/Test.hs:193:
      expected: "invalid code lengths set"
       but got: "Operation-Ending-Supplemental Code is 0x27"

It is working normally elsewhere (x86_64, i686, aarch64, armv7hl, ppc64le).
https://koji.fedoraproject.org/koji/taskinfo?taskID=83592764

Downstream bug: https://bugzilla.redhat.com/show_bug.cgi?id=2045430 with a copy of the tail of the build.log

The version of the C zlib in Fedora or the package has not changed.

Support tasty-hunit-0.9

As it says "DO NOT USE YET", I leave this here and don't make a PR.

From 08fa33173fe347a028c43b0f002356ef9d6a0772 Mon Sep 17 00:00:00 2001
From: Oleg Grenrus <[email protected]>
Date: Tue, 22 Sep 2015 07:58:18 +0300
Subject: [PATCH] Support tasty-hunit-0.9

---
 test/Test.hs | 1 -
 zlib.cabal   | 5 ++---
 2 files changed, 2 insertions(+), 4 deletions(-)

diff --git a/test/Test.hs b/test/Test.hs
index 0665542..bbd8b94 100644
--- a/test/Test.hs
+++ b/test/Test.hs
@@ -11,7 +11,6 @@ import Test.Codec.Compression.Zlib.Internal ()
 import Test.Codec.Compression.Zlib.Stream ()

 import Test.QuickCheck
-import Test.HUnit
 import Test.Tasty
 import Test.Tasty.QuickCheck
 import Test.Tasty.HUnit
diff --git a/zlib.cabal b/zlib.cabal
index 6e99e45..c9300b3 100644
--- a/zlib.cabal
+++ b/zlib.cabal
@@ -80,8 +80,7 @@ test-suite tests
   default-language: Haskell2010
   build-depends:   base, bytestring, zlib,
                    QuickCheck       == 2.*,
-                   HUnit            == 1.2.*,
-                   tasty            >= 0.8 && < 0.11,
+                   tasty            >= 0.8 && < 0.12,
                    tasty-quickcheck == 0.8.*,
-                   tasty-hunit      == 0.8.*
+                   tasty-hunit      >= 0.8 && < 0.10
   ghc-options:     -Wall
-- 
2.4.2

unknown symbol `inflateReset'

I get the following build error when building a Haskell program that uses the wai-app-static package. I'm not 100% sure this is the correct place to report this error, but it does look like the problem lies in the zlib part, which I'm guessing is used by wai-app-static.

wai-app-static > ghc-9.4.4: /home/dude/.stack/snapshots/x86_64-linux/28ecbfa8d3a82c3a295081db670ff5e7002fda74f91285cbe3eab1ed42167def/9.4.4/lib/x86_64-linux-ghc-9.4.4/zlib-0.6.3.0-GcrDIyj6sLL3AXxniz0lAg/HSzlib-0.6.3.0-GcrDIyj6sLL3AXxniz0lAg.o: unknown symbol `inflateReset'
wai-app-static > ghc-9.4.4: Could not load Object Code /home/dude/.stack/snapshots/x86_64-linux/28ecbfa8d3a82c3a295081db670ff5e7002fda74f91285cbe3eab1ed42167def/9.4.4/lib/x86_64-linux-ghc-9.4.4/zlib-0.6.3.0-GcrDIyj6sLL3AXxniz0lAg/HSzlib-0.6.3.0-GcrDIyj6sLL3AXxniz0lAg.o.
wai-app-static > <no location info>: error: unable to load unit `zlib-0.6.3.0'

A relevant piece of information here is that I am trying to link statically, using stack and musl libc on alpine linux.

Runtime failure on Windows on version 0.7.0.0

To observe the issue, do the following on Windows (having installed GHC 9.2.8+).

Have an example.cabal with the following contents:

cabal-version: 3.8
name: example
version: 0.1
executable example
  build-depends: base, zlib == 0.7.0.0
  main-is: Main.hs

Have a file Main.hs with the following contents:

module Main where
import Codec.Compression.Zlib.Raw
main = do
    putStrLn "Test"
c = compress

Run cabal build, then cabal exec example. For some reason, Test is not printed to stdout. Running echo $lastexitcode shows that the exit code is -1073741701, which from a quick Google search is typically related to incorrect linking. Note that this is a runtime failure, not a build failure.

Changing the zlib version to 0.6.3.0 (which is the previous version) means that this program works.

This is probably related to "Do not force bundled-c-zlib on Windows, but force it for WASM" in the previous release, if I had to guess.

This error arose when similar code was written using a library massively downstream of zlib (discord-haskell, with the code below). This is even more surprising, since I'm pretty sure that restCall shouldn't directly reference compress or similar values:

module Main where

import Discord

main :: IO ()
main = do
    putStrLn "Test"

a :: (Request (r a), FromJSON a) => r a -> DiscordHandler (Either RestCallErrorCode a)
a = restCall

One other note is that my Windows Haskell setup is entirely fresh and was made specifically to test this out, so it's unlikely to be an issue with my machine (also considering that someone else brought this issue to me).

Include pure variant of decompress

A function

decompressEither :: ByteString -> Either String ByteString

that never throws should be included as a courtesy. Although it can be thrown together from the stream interface in Codec.Compression.Zlib.Internal, this is both not obvious and not trivial.
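A possible sketch of such a function, built on the Internal streaming interface (assuming foldDecompressStreamWithInput and decompressST as exported from Codec.Compression.Zlib.Internal), rendering the error with show:

```haskell
import qualified Data.ByteString.Lazy as BL
import Codec.Compression.Zlib.Internal

-- Fold over the streaming decompressor, turning any DecompressError
-- into a Left instead of a thrown exception.
decompressEither :: BL.ByteString -> Either String BL.ByteString
decompressEither =
  foldDecompressStreamWithInput
    (\chunk rest -> BL.append (BL.fromStrict chunk) <$> rest)  -- prepend each output chunk
    (\_leftover  -> Right BL.empty)                            -- end of stream: success
    (Left . show)                                              -- decompression error: report it
    (decompressST zlibFormat defaultDecompressParams)
```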

Support bytestring chunks bigger than 4Gb

Quoting TODO:

zlib/TODO

Lines 1 to 3 in d2d539d

* Currently will not support bytestring chunks bigger than 4Gb
because the in_avail is only a C int. So chunks bigger than that
need to be broken up into smaller ones.

The TODO is 15 years old, so maybe it has already been fixed long ago, but I cannot find a regression test.
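Until the library handles this itself, a caller could work around the limit by re-chunking the input lazy ByteString before handing it over. A minimal sketch; in practice the limit would be just under 2^31, it is only a parameter here for testability:

```haskell
import qualified Data.ByteString as S
import qualified Data.ByteString.Lazy as BL

-- Split any strict chunk larger than 'limit' bytes, so that no single
-- chunk passed to the C side overflows a C int length field.
rechunk :: Int -> BL.ByteString -> BL.ByteString
rechunk limit = BL.fromChunks . concatMap split . BL.toChunks
  where
    split c
      | S.length c <= limit = [c]
      | otherwise           = let (a, b) = S.splitAt limit c in a : split b
```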

Name clash with other foreign libraries

Combining Haskell code that uses the Haskell zlib library with code that uses the grpc C++ library from Google causes a link error, because the names in adler32.o are duplicated between both libraries. Is it possible to rename the vendored sources' function names so this library can stand alone and not clash with anything else?
