
IceSat2R's Introduction


IceSat2R


Programmatic connection to the OpenAltimetry EARTHDATA API to download and process the following ICESat-2 Altimeter Data products:

  • 'ATL03' (Global Geolocated Photon Data)
  • 'ATL06' (Land Ice Height)
  • 'ATL07' (Sea Ice Height)
  • 'ATL08' (Land and Vegetation Height)
  • 'ATL10' (Sea Ice Freeboard)
  • 'ATL12' (Ocean Surface Height)
  • 'ATL13' (Inland Water Surface Height)

The user can download the data by selecting a bounding box from a global 1- or 5-degree grid using a shiny application. The Documentation, the two package Vignettes (first, second) and the blog post explain the functionality in detail.


The ICESat-2 mission collects altimetry data of the Earth's surface. The sole instrument on ICESat-2 is the Advanced Topographic Laser Altimeter System (ATLAS), which measures ice sheet elevation change and sea ice thickness, while also generating an estimate of global vegetation biomass.

ICESat-2 continues the important observations of

  • ice-sheet elevation change
  • sea-ice freeboard, and
  • vegetation canopy height

begun by ICESat in 2003.


System Requirements

Using the IceSat2R package requires a geospatial setup, as specified in the sf or terra README.md files.


How can the IceSat2R package be used?


The IceSat2R package includes code, documentation, and examples so that:

  • A user can select an area of interest (AOI) either programmatically or interactively
  • If the Reference Ground Track (RGT) is not known, the user has the option to utilize either
    • one of the "overall_mission_orbits()" or "time_specific_orbits()" to compute the RGT(s) for a pre-specified global area or for a time period, or
    • one of the "vsi_nominal_orbits_wkt()" or "vsi_time_specific_orbits_wkt()" to compute the RGT(s) for a specific AOI
  • Once the RGT is computed it can be verified with the "getTracks()" function of the OpenAltimetry EARTHDATA Web API
  • Finally, the user can utilize either the "get_atlas_data()" or the "get_level3a_data()" function to retrieve the data for specific product(s), date(s) and beam(s)

This workflow is also illustrated in the following diagram,
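The steps above can be condensed into a few lines of R (a minimal, illustrative sketch rather than a complete example: it requires network access, the coordinates and dates are arbitrary, and passing all computed RGTs directly to 'verify_RGTs()' skips the intermediate spatial intersection shown in the vignettes):

```r
require(IceSat2R)
require(sf)

# 1. define the area of interest (AOI) programmatically as a WKT polygon (illustrative coordinates)
plg = "POLYGON ((140 -6.64, 145 -6.64, 145 -1.64, 140 -1.64, 140 -6.64))"
bbx = sf::st_bbox(sf::st_as_sfc(plg, crs = 4326))

# 2. compute the RGT(s) for a time period
rgts = time_specific_orbits(date_from = "2021-02-01",
                            date_to = "2021-02-15",
                            download_method = "curl",
                            threads = parallel::detectCores(),
                            verbose = TRUE)

# 3. verify the RGT(s) using the 'getTracks()' function of the OpenAltimetry Web API
dtbl_rgts = verify_RGTs(nsidc_rgts = rgts, bbx_aoi = bbx, verbose = TRUE)

# 4. retrieve the data for a specific product, date and beam
dat = get_atlas_data(minx = as.numeric(bbx["xmin"]), miny = as.numeric(bbx["ymin"]),
                     maxx = as.numeric(bbx["xmax"]), maxy = as.numeric(bbx["ymax"]),
                     date = dtbl_rgts$Date_time[1], trackId = dtbl_rgts$RGT_NSIDC[1],
                     beamName = "gt1r", product = "atl06",
                     client = "portal", outputFormat = "csv", verbose = FALSE)
```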



Shiny application to select an area of interest (AOI) from a 1- or 5-degree global grid


The OpenAltimetry EARTHDATA API restricts requests to a 1x1- or 5x5-degree spatial bounding box, unless the "sampling" parameter is set to TRUE. The shiny application of the IceSat2R package allows the user to create a spatial grid over an AOI, preferably a 1- or 5-degree grid, so that the selection stays within these limits. An alternative is to create a grid of smaller cells than required (for instance a 4-degree grid) and then select multiple grid cells,
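The idea of staying within the request limits can be illustrated with a small base-R helper (a hypothetical function, not part of the package; the package itself builds grids with 'degrees_to_global_grid()' or the shiny application):

```r
# split a bounding box into cells of at most 'step' degrees, so that each
# request to the API stays within the 5x5-degree limit
# (hypothetical helper for illustration only)
split_bbox = function(minx, miny, maxx, maxy, step = 5) {
  # cell lower-left corners along each axis (the final partial cell is kept)
  x_starts = head(unique(c(seq(minx, maxx, by = step), maxx)), -1)
  y_starts = head(unique(c(seq(miny, maxy, by = step), maxy)), -1)
  cells = expand.grid(xmin = x_starts, ymin = y_starts)
  # clip each cell's upper-right corner to the overall bounding box
  cells$xmax = pmin(cells$xmin + step, maxx)
  cells$ymax = pmin(cells$ymin + step, maxy)
  cells
}

cells = split_bbox(minx = 140, miny = -10, maxx = 150, maxy = -5, step = 5)
nrow(cells)   # 2 cells, each 5 x 5 degrees
```

Each row of the returned data.frame could then be used as the bounding box of a separate request.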



Example Use Case-1: 3-Dimensional Line Plot by combining ICESat-2 and Copernicus DEM (Digital Elevation Model) Data


The following 3-dimensional interactive line plot (which appears in the 'IceSat-2_Atlas_products' Vignette) shows:

  • in blue, the DEM-based elevation along the two ICESat-2 beams ('gt1r' and 'gt2l'), which are separated by a distance of approximately 3 km
  • in orange, the land-ice-height measurements of the summer period (separately for 'gt1r' and 'gt2l')
  • in green, the land-ice-height measurements of the winter period (separately for 'gt1r' and 'gt2l')


Example Use Case-2: Multi-Plot displaying the Ice, Land, Canopy and Copernicus DEM (30-meter) of each beam separately for a specific ICESat-2 Track and area of interest (Himalayas mountain range)



Binder

The user of the IceSat2R R package can reproduce the examples of the documentation using the available Binder RStudio image. Once launched, the cloud instance will take a few minutes to be ready. You can read more about Binder on the web; in short, Binder makes "your code immediately reproducible by anyone, anywhere".


Docker Image


Docker images of the IceSat2R package are available to download from my dockerhub account. The images come with RStudio and the latest R development version installed. The whole process was tested on Ubuntu 18.04. To pull and run the image:


docker pull mlampros/icesat2r:rstudiodev

docker run -d --name rstudio_dev -e USER=rstudio -e PASSWORD=give_here_your_password --rm -p 8787:8787 mlampros/icesat2r:rstudiodev

The user can also bind a home directory / folder to the image to use its files by specifying the -v option,


docker run -d --name rstudio_dev -e USER=rstudio -e PASSWORD=give_here_your_password --rm -p 8787:8787 -v /home/YOUR_DIR:/home/rstudio/YOUR_DIR mlampros/icesat2r:rstudiodev


The USER defaults to rstudio, but you have to provide a PASSWORD of your preference (see https://rocker-project.org/ for more information).


Open your web browser and, depending on where the docker image was built / run, navigate to one of the following:


1st option, on your personal computer,


http://0.0.0.0:8787

2nd option, on a cloud instance,


http://<Public-DNS>:8787

to access the RStudio console, where you can enter your username and password.


Installation:


To install the package from CRAN use,

install.packages("IceSat2R")

and to download the latest version of the package from Github,

remotes::install_github('mlampros/IceSat2R')

R package tests:


To execute the package tests (all or a specific file) use the following code snippet:

# first download the latest version of the package

url_pkg = 'https://github.com/mlampros/IceSat2R/archive/refs/heads/master.zip'
temp_pkg_file = tempfile(fileext = '.zip')
print(temp_pkg_file)

download.file(url = url_pkg, destfile = temp_pkg_file, quiet = TRUE)

dir_pkg_save = dirname(temp_pkg_file)
utils::unzip(zipfile = temp_pkg_file, exdir = dir_pkg_save, junkpaths = FALSE)

# build and install the latest version of the package

require(glue)

setwd(dir_pkg_save)
system('R CMD build --compact-vignettes="gs+qpdf" --resave-data IceSat2R-master')
gz_file = grep(pattern = "^IceSat2R_.*\\.tar\\.gz$", x = list.files(), value = TRUE)
system(glue::glue("R CMD INSTALL {gz_file}"))

# load the package

require(IceSat2R)

# run all tests

testthat::test_local(path = file.path(dirname(temp_pkg_file), 'IceSat2R-master'),
                     reporter = testthat::default_reporter())

# run a specific test file from the 'testthat' directory of the package 
# https://github.com/mlampros/IceSat2R/tree/master/tests/testthat

test_specific_file = file.path(dirname(temp_pkg_file), 
                               'IceSat2R-master', 
                               'tests', 
                               'testthat', 
                               'test-mission_orbits.R')

Sys.setenv(NOT_CRAN = "true")       # run all tests (including the ones skipped on CRAN)
testthat::test_file(path = test_specific_file, reporter = testthat::default_reporter())
Sys.unsetenv("NOT_CRAN")            # unset the previously modified environment variable

The previous code snippet allows a user to test whether the package works as expected on any operating system.


The Beam Pattern


The ATLAS beam pattern on the ground changes depending on the orientation of the ICESat-2 observatory. The pattern on top (of the following Figure) corresponds to traveling in the forward (+x) orientation, while the pattern on the bottom corresponds to traveling in the backward (-x) orientation. The numbers indicate the corresponding ATLAS beam, while the L/R mapping is used in the ATL03 and higher-level data products. The two strong beams with the TEP are ATLAS beams 1 and 3 (Fig. 8, Neumann et al., 2019, https://doi.org/10.1016/j.rse.2019.111325).



Using a table to map the strong and weak beams (Reference: sliderule-python documentation)


ATLAS oriented forward (+x)

ATLAS Spot Number   Ground Track Designation   Beam Strength
1                   gt3r                       Strong
2                   gt3l                       Weak
3                   gt2r                       Strong
4                   gt2l                       Weak
5                   gt1r                       Strong
6                   gt1l                       Weak

ATLAS oriented backwards (-x)

ATLAS Spot Number   Ground Track Designation   Beam Strength
1                   gt3r                       Weak
2                   gt3l                       Strong
3                   gt2r                       Weak
4                   gt2l                       Strong
5                   gt1r                       Weak
6                   gt1l                       Strong
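The two tables can also be expressed as a small lookup helper (an illustrative sketch derived from the tables above; the function name is hypothetical and not part of the package):

```r
# return the beam strength for a ground track, given the ATLAS orientation
# (mapping taken from the two tables above)
beam_strength = function(ground_track, orientation = c("forward", "backward")) {
  orientation = match.arg(orientation)
  # in the forward (+x) orientation the right ('r') beams are the strong ones,
  # in the backward (-x) orientation the left ('l') beams are the strong ones
  strong_right = (orientation == "forward")
  is_right = grepl("r$", ground_track)
  ifelse(is_right == strong_right, "Strong", "Weak")
}

beam_strength("gt1r", "forward")    # "Strong"
beam_strength("gt1r", "backward")   # "Weak"
```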

Citation:


If you use the code of this repository in your paper or research, please cite both IceSat2R (https://CRAN.R-project.org/package=IceSat2R) and the original articles (see CITATION):


@Manual{,
  title = {{IceSat2R}: ICESat-2 Altimeter Data using R},
  author = {Lampros Mouselimis},
  year = {2022},
  note = {R package version 1.0.4},
  url = {https://CRAN.R-project.org/package=IceSat2R},
}

Code of Conduct

Please note that the IceSat2R project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.

Acknowledgment

This project received financial support from the


IceSat2R's People

Contributors

actions-user, arfon, github-actions[bot], mlampros


IceSat2R's Issues

The OpenAltimetry API does not return the strong and weak beams for the ATL03 product

In many scientific papers the authors distinguish between strong and weak beams (data observations), mainly because strong beams give better results (in specific scientific areas).
The OpenAltimetry API does not return the strong and weak beams for the ATL03 product (raw photons). For instance, the ATL08 product includes this information in the output .json file, but not in the .csv file. However, this is not the case for the ATL03 product, where neither the .json nor the .csv output file includes any information regarding the strong and weak beams.
I've included information regarding the beam pattern in the README.md file, and I also sent an e-mail to the OpenAltimetry support (on 28 May 2022) and am waiting for a response. I've attached here a .zip file that includes two .RDS files,

  • atl03_beams_strong_weak_JSON.RDS
  • atl03_beams_strong_weak_CSV.RDS

The following code snippet makes it possible to reproduce the two attached files:
atl03_beams_strong_weak.zip

require(IceSat2R)

#...........................
# Sample dates for RGT cycle
#...........................

approx_date_start = "2021-02-01"
approx_date_end = "2021-02-15"

res_rgt_many = time_specific_orbits(date_from = approx_date_start,
                                    date_to = approx_date_end,
                                    RGT_cycle = NULL,
                                    download_method = 'curl',
                                    threads = parallel::detectCores(),
                                    verbose = TRUE)

#.....................................................
# then we create the bounding box for a sample area
#.....................................................

plg = "POLYGON ((140 -6.641235, 145 -6.641235, 145 -1.641235, 140 -1.641235, 140 -6.641235))"
sf_obj_bbx = sf::st_as_sfc(plg, crs = 4326)
bbx_aoi = sf::st_bbox(obj = sf_obj_bbx)

# global grid of 1-degree

gl_grid_1_d = degrees_to_global_grid(minx = as.numeric(bbx_aoi['xmin']),
                                     maxx = as.numeric(bbx_aoi['xmax']),
                                     maxy = as.numeric(bbx_aoi['ymax']),
                                     miny = as.numeric(bbx_aoi['ymin']),
                                     degrees = 0.9,
                                     verbose = TRUE)

res_inters = sf::st_intersects(x = gl_grid_1_d,
                               y = sf::st_geometry(res_rgt_many),
                               sparse = TRUE)
#.....................
# matched (RGT) tracks
#.....................

df_inters = data.frame(res_inters)
rgt_subs = res_rgt_many[df_inters$col.id, , drop = FALSE]
rgt_subs

#.............
# verify RGT's
#.............

dtbl_rgts = verify_RGTs(nsidc_rgts = rgt_subs,
                        bbx_aoi = bbx_aoi,
                        verbose = TRUE)

#........................................
# for-loop to iterate over the parameters
#........................................

dates_iters = unique(dtbl_rgts$Date_time)
RGTs_iters = unique(dtbl_rgts$RGT_NSIDC)
BEAMS = c( 'gt1l', 'gt1r', 'gt2l', 'gt2r', 'gt3l', 'gt3r')

dat_out_csv = dat_out_json = logs_out = list()

for (idx in seq_along(dates_iters)) {

  date_i = dates_iters[idx]
  rgt_i = RGTs_iters[idx]

  for (beam_item in BEAMS) {
    for (ROW_ID in df_inters$row.id) {

      name_iter = glue::glue("{date_i}_{rgt_i}_{beam_item}_{ROW_ID}")
      cat(glue::glue("Date: '{date_i}'  RGT: '{rgt_i}'  Beam: '{beam_item}'  Global-Grid-ID: '{ROW_ID}'"), '\n')

      bbx_gl_grid = gl_grid_1_d[ROW_ID, , drop = F]
      bbx_gl_grid = sf::st_bbox(obj = bbx_gl_grid)

      iter_dat = get_atlas_data(minx = as.numeric(bbx_gl_grid['xmin']),
                                miny = as.numeric(bbx_gl_grid['ymin']),
                                maxx = as.numeric(bbx_gl_grid['xmax']),
                                maxy = as.numeric(bbx_gl_grid['ymax']),
                                date = date_i,
                                trackId = rgt_i,
                                beamName = beam_item,
                                product = 'atl03',
                                client = 'portal',
                                outputFormat = 'csv',
                                verbose = FALSE)

      iter_dat_json = get_atlas_data(minx = as.numeric(bbx_gl_grid['xmin']),
                                     miny = as.numeric(bbx_gl_grid['ymin']),
                                     maxx = as.numeric(bbx_gl_grid['xmax']),
                                     maxy = as.numeric(bbx_gl_grid['ymax']),
                                     date = date_i,
                                     trackId = rgt_i,
                                     beamName = beam_item,
                                     product = 'atl03',
                                     client = 'portal',
                                     outputFormat = 'json',
                                     verbose = FALSE)

      iter_logs = list(Date = date_i,
                       RGT = rgt_i,
                       Beam = beam_item,
                       N_rows = nrow(iter_dat))

      logs_out[[name_iter]] = data.table::setDT(iter_logs)
      dat_out_csv[[name_iter]] = iter_dat
      dat_out_json[[name_iter]] = iter_dat_json
    }
  }
}

dat_out_csv_df = data.table::rbindlist(dat_out_csv)

saveRDS(object = dat_out_csv_df, file = 'atl03_beams_strong_weak_CSV.RDS')
saveRDS(object = dat_out_json, file = 'atl03_beams_strong_weak_JSON.RDS')

1 output sublist did not have a valid 'description' and will be removed!

I tried the first vignette again with the correct locale. I got:

rgt_winter = time_specific_orbits(date_from = start_date_w,
                                  date_to = end_date_w,
                                  RGT_cycle = NULL,
                                  download_method = 'curl',
                                  threads = parallel::detectCores(),
                                  verbose = TRUE)
#> The available Icesat-2 orbits will be red from 'https://icesat-2.gsfc.nasa.gov/science/specs' ... 
#> Access the data of the technical specs website ...
#> Extract the .zip files and the corresponding titles ...
#> Keep the relevant data from the url's and titles ...
#> Process the nominal and time specific orbits separately ...
#> Adjust the Dates of the time specific orbits ...
#> Create the nominal orbits data.table ...
#> Create the time specific orbits data.table ...
#> Return a single data.table ...
#> Elapsed time: 0 hours and 0 minutes and 0 seconds. 
#> ICESAT-2 orbits: 'Earliest-Date' is '2018-10-13'  'Latest-Date' is '2022-09-20' 
#> -----------------------------------------------------
#> The .zip file of 'RGT_cycle_9' will be downloaded ... 
#> -----------------------------------------------------
#>   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
#>                                  Dload  Upload   Total   Spent    Left  Speed
#> 100  140M  100  140M    0     0  1201k      0  0:01:59  0:01:59 --:--:-- 1340k
#> The downloaded .zip file will be extracted in the '/tmp/RtmphQC1V2/RGT_cycle_9' directory ... 
#> Download and unzip the RGT_cycle_9 .zip file: Elapsed time: 0 hours and 2 minutes and 2 seconds. 
#> 138 .kml files will be processed ... 
#> Parallel processing of 138 .kml files using  4  threads starts ... 
#> 1 output sublist did not have a valid 'description' and will be removed!
#> 1 output sublist did not have a valid 'description' and will be removed!
#> 1 output sublist did not have a valid 'description' and will be removed!
#> 1 output sublist did not have a valid 'description' and will be removed!
#> 1 output sublist did not have a valid 'description' and will be removed!
#>  138 times the same message

I got the same message again and again for each successive download: 824 times for RGT_cycle_9, and so on. In the end, rgt_winter looks legit.

OpenAltimetry Version 2.0

Many functions in the IceSat2R package no longer work, as the OpenAltimetry website (previously https://openaltimetry.org/) appears to have migrated to https://openaltimetry.earthdatacloud.nasa.gov/ in their new Version 2.0 release (released October 5th, see https://nsidc.org/openaltimetry/known-issues-updates#anchor-0). The URLs accessed in most functions no longer load.

For example, when attempting to use getTracks(), which accesses openaltimetry.org/data/api/icesat2/getTracks....., the function fails with the following error: "Timeout was reached: [openaltimetry.org] Connection timeout after 10005 ms"

Will there be any updates to keep IceSat2R working with the new OpenAltimetry version 2.0?

Thank you!

1 output sublist did not have a valid 'description' and will be removed!

I tried to run the second vignette. When I tried to create rgt_winter, I got the warning 138 times. Furthermore, I got this error: "Error: It seems that after splitting the observations by empty space the number of columns (per row) are not an even number!"

The output object looks like this: Simple feature collection with 8645 features and 5 fields

I tried to run the testthat function you described; the first test passed. With the second test I got this error:
── Warning (Line 8): the function 'vsi_time_specific_orbits_wkt()' returns the expected output for an input WKT! ──
The 'sf' gdalinfo returned an empty character string (or vector)! Attempt to read the url using the OS configured 'gdalinfo' function! (As an alternative you can set the parameter 'download_zip' to TRUE and re-run the function)
Backtrace:

  1. IceSat2R::vsi_time_specific_orbits_wkt(...)
  2. IceSat2R::vsi_kml_from_zip(...)

── Error (Line 8): the function 'vsi_time_specific_orbits_wkt()' returns the expected output for an input WKT! ──
Error: The OS 'gdalinfo' gave the following error: '"gdalinfo"' not found!
Backtrace:

  1. IceSat2R::vsi_time_specific_orbits_wkt(...)
  2. IceSat2R::vsi_kml_from_zip(...)

When I continue through the script, R could not find ne_10m_glaciated_areas.RDS:
ne_glaciers = system.file('data_files', 'ne_10m_glaciated_areas.RDS', package = "IceSat2R", mustWork = TRUE)
Error in system.file("data_files", "ne_10m_glaciated_areas.RDS", package = "IceSat2R", :
keine Datei gefunden (i.e. "no file found")

Can you help me to solve this problem?
Thank you very much

e-mail received from NSIDC User Services

Dear Colleague,
 
On April 4, 2022, at approximately 14:12 UTC, the ICESat-2 satellite experienced an anomaly and entered into Safe Mode.

No science data are being recorded at the current time. All data sets will experience a gap in temporal coverage.

The expedited latency “quick look” data sets, which are part of the Land, Atmosphere Near real-time Capability for EOS (LANCE) program, are immediately impacted.

NSIDC will provide an update once the problem has been resolved.
 
Best regards,
 
NSIDC DAAC
User Services

Consistent naming, consider changing from IceSat2 to ICESat-2

Your package is called IceSat2R, but the official name of the mission is ICESat-2. Note the capitals and dash. I'm not sure whether the package can be renamed (ICESat2R?), but I think the documentation could use consistent naming of ICESat-2 in the titles.

update the IceSat-2 R package (OpenAltimetry --> EarthData)

I'll add information in this issue as I update the IceSat-2 R package after the migration of OpenAltimetry to EarthData:

  • available_nominal_orbits (Nominal mission orbits)
  • available_RGTs (Reference Ground Tracks (RGTs))
  • degrees_to_global_grid (Create a global grid based on degrees)
  • getTracks (Get the ICESAT-2 Tracks)
  • get_atlas_data ('atl03' is not yet functional, but 'atl06', 'atl07', 'atl08', 'atl10', 'atl12' and 'atl13' work as expected)
  • get_level3a_data (Get IceSat-2 ATLAS 'Level-3A' data for a time interval)
  • latest_orbits (Extraction of the url from the Technical Specification Website)
  • ne_10m_glaciated_areas (Natural Earth 10m Glaciated Areas (1:10 million scale))
  • overall_mission_orbits (Overall Mission Orbits)
  • revisit_time_RGTs (Revisit Time Reference Ground Tracks and Dates)
  • RGT_cycle_14 (Reference Ground Tracks (RGTs) for IceSat-2 Cycle 14)
  • select_aoi_global_grid (R6 Class to Select an Area of Interest (AOI) from a Global Grid)
  • time_specific_orbits (Time Specific Orbits)
  • verify_RGTs (Verification of the Reference Ground Tracks (RGTs))
  • vsi_kml_from_zip (Utilizing Virtual File Systems (vsi) to extract the .kml from the .zip file)
  • vsi_nominal_orbits_wkt (Utilizing Virtual File Systems (vsi) and Well Known Text (WKT) to access the 'nominal orbits')
  • vsi_time_specific_orbits_wkt (Utilizing Virtual File Systems (vsi) and Well Known Text (WKT) to access the 'time specific orbits')

This issue is a continuation of #13

Bug in `time_specific_orbits`: `Class attribute on column 3 of item 96 does not match with column 3 of item 1.`

From the documentation here, I ran

> start_date_w = "2020-12-15"
> end_date_w = "2021-02-15"
> 
> rgt_winter = time_specific_orbits(date_from = start_date_w,
+                                   date_to = end_date_w,
+                                   RGT_cycle = NULL,
+                                   download_method = 'curl',
+                                   threads = parallel::detectCores(),
+                                   verbose = TRUE)
The available Icesat-2 orbits will be red from 'https://icesat-2.gsfc.nasa.gov/science/specs' ... 
Access the data of the technical specs website ...
Extract the .zip files and the corresponding titles ...
Keep the relevant data from the url's and titles ...
Process the nominal and time specific orbits separately ...
Adjust the Dates of the time specific orbits ...
Create the nominal orbits data.table ...
Create the time specific orbits data.table ...
Return a single data.table ...
Elapsed time: 0 hours and 0 minutes and 0 seconds. 
ICESAT-2 orbits: 'Earliest-Date' is '2018-10-13'  'Latest-Date' is '2022-09-20' 
-----------------------------------------------------
The .zip file of 'RGT_cycle_9' will be downloaded ... 
-----------------------------------------------------
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  140M  100  140M    0     0  5630k      0  0:00:25  0:00:25 --:--:-- 2878k
The downloaded .zip file will be extracted in the '/var/folders/w_/152mmjvx4cx7y55ky886tp1h0000gn/T//RtmpukLAeZ/RGT_cycle_9' directory ... 
Download and unzip the RGT_cycle_9 .zip file: Elapsed time: 0 hours and 0 minutes and 27 seconds. 
138 .kml files will be processed ... 
Parallel processing of 138 .kml files using  10  threads starts ... 
Error in data.table::rbindlist(inner_obj) : 
  Class attribute on column 3 of item 96 does not match with column 3 of item 1.

Suggestion of simplification for the vignettes and article

As a user, a beginner, and a person who does not know IceSat, let me suggest what kind of content I'd like to see in vignette 1. This suggestion is related to my review, where I said that the documentation and vignettes are tedious to read and overly complex. I suggest creating higher-level functions and respecting common data storage practices, so that vignette 1 looks like this:

require(IceSat2R)
require(sf)
sf::sf_use_s2(use_s2 = FALSE)   

data(greenl_sh_east)
bb = st_bbox(greenl_sh_east)

rgt_winter = time_specific_orbits("2020-12-15","2021-02-15", bbox = bb, threads = 4, verbose = TRUE)
rgt_summer = time_specific_orbits("2021-06-15","2021-08-15", bbox = bb, threads = 4, verbose = TRUE)

7 lines, and we are already somewhere between lines 35 and 45.

Then there is the "unique RGT" stuff. You may keep it as is, or include a function in your package to do it. It looks standard and repetitive, so in my opinion it should go in the package. No need to print stuff; beginners are able to use length(unq_rgt_winter) themselves. 2 lines instead of 15:

unq_rgt_winter = clean_duplicates(rgt_subs_winter)
unq_rgt_summer = clean_duplicates(rgt_subs_summer)
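For reference, such a 'clean_duplicates()' helper could be as simple as the following sketch (both the function and the 'RGT' column name are assumptions about the object returned by 'time_specific_orbits()'; neither exists in the package):

```r
# hypothetical helper: keep a single observation per Reference Ground Track
clean_duplicates = function(rgt_data, column = "RGT") {
  # drop every row whose RGT value was already seen
  rgt_data[!duplicated(rgt_data[[column]]), , drop = FALSE]
}

# usage with a toy data.frame
toy = data.frame(RGT = c(1, 1, 2, 3, 3), Date_time = Sys.Date() + 0:4)
nrow(clean_duplicates(toy))   # 3
```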

Then we have the definition of verify_RGTs. This should be a function of the package. We gain another 40 lines of code.

Then we have, twice, a complex set of code around verify_RGTs. It becomes 2 lines of code, because verify_RGTs already sets the correct column names and is internally able to deal with standard data:

ver_trc_winter = verify_RGTs(rgt_winter)
ver_trc_summer = verify_RGTs(rgt_summer)

Then we have

greenl_grid = degrees_to_global_grid(greenl_sh_east,  degrees = 4.5, verbose = TRUE)

And I stopped here. The first half of vignette 1 can be contained in 12 lines with a careful design. Nobody wants to go through lines and lines and lines of tedious code in the first learning step 😉
