
RNetCDF's People

Contributors

fenclmar, jeroen, mdsumner, mjwoods, pekkarr


RNetCDF's Issues

netcdf4 and opendap support on Windows

Michael Sumner reports that RNetCDF on Windows is built with netcdf3. It would be useful to build with netcdf4 instead, so that netcdf4 classic files can be read and modified. OPeNDAP support would also be useful.

Parallel support requires mpi.h

Problem reported by pakalee on 2022-09-13:

One minor thing with RNetCDF I found is that, if you don't have your system's MPI headers in a discoverable place, then netcdf_par.h won't compile, autoconf will report the header is not found, and you won't have parallel support compiled in. The user is unlikely to notice this until they get the "MPI is not supported" message from dataset.c, and there's little else to indicate why. It may be a good thing to remind folks in the INSTALL/README to use CPPFLAGS to include the MPI header path. Or, perhaps a better approach would be to make parallel support an explicit configure option ("--with-parallel" or "--with-mpi") and then fail the configure step if netcdf_par.h can't be found or fails to compile.

Getting variable names from netCDF file

Hi, I am wondering how to get the names of all variables in a netCDF file. In ncdf4 this is easy with:

nc <- nc_open("someFile.nc")
names(nc$var)

Is there an equivalent method in RNetCDF?
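One possible sketch with RNetCDF (the file name "someFile.nc" is a placeholder): query the variable count with file.inq.nc, then loop over the 0-based variable ids with var.inq.nc.

```r
library(RNetCDF)

nc <- open.nc("someFile.nc")
nvars <- file.inq.nc(nc)$nvars            # number of variables in the dataset
varnames <- vapply(seq_len(nvars) - 1,    # variable ids are 0-based
                   function(id) var.inq.nc(nc, id)$name,
                   character(1))
print(varnames)
close.nc(nc)
```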

A NetCDF file is not readable when created with `diskless = TRUE` and `persist = TRUE`

I've encountered a problem. A NetCDF file created with diskless = TRUE and persist = TRUE is not readable.

Example 1. Everything is ok.

file1 <- "file1.nc"
nc1 <- create.nc(file1, prefill = FALSE, format = "classic")
dim.def.nc(nc1, "station", 5)
dim.def.nc(nc1, "max_string_length", 32)
var.def.nc(nc1, "name", "NC_CHAR", c("max_string_length", "station"))
myname <- c("alfa", "bravo", "charlie", "delta", "echo")
var.put.nc(nc1, "name", myname)
close.nc(nc1)

nc1 <- open.nc(file1)
close.nc(nc1)

Example 2. The only difference from Example 1 is the values of the diskless and persist arguments. The created file is not readable (the following message appears: "Error in open.nc(file2) : Invalid argument").

file2 <- "file2.nc"
nc2 <- create.nc(file2, prefill = FALSE, format = "classic",
                 diskless = TRUE, persist = TRUE)
dim.def.nc(nc2, "station", 5)
dim.def.nc(nc2, "max_string_length", 32)
var.def.nc(nc2, "name", "NC_CHAR", c("max_string_length", "station"))
myname <- c("alfa", "bravo", "charlie", "delta", "echo")
var.put.nc(nc2, "name", myname)
close.nc(nc2)

nc2 <- open.nc(file2)
close.nc(nc2)

Enum types

NetCDF4 provides an enumerated type that maps integer values to names, similar to an R factor.

The current master branch provides support for defining and inquiring about this type. Read/write support for variables and attributes of this type is yet to be added.

Allow numeric type identifiers

Type identifiers are currently required to be strings. Allow numeric identifiers, as provided by type definition and inquiry routines.

Opaque types

Netcdf4 provides an "opaque" type that can be used for blobs of data with a fixed size. These can be stored in R as arrays of raw bytes. No type conversions would need to be performed by RNetCDF.

Now all we need to do is write the code.

Unable to update RNetCDF using `update.packages`

Trying to update my R install, on Debian Bookworm, I found that RNetCDF kept on failing to install with the error

... snipped ...
make[2]: Entering directory '/tmp/Rtmp9yvDRn/R.INSTALLdbc78d45d54/RNetCDF/src'
make[2]: /usr/lib/R/share/make/shlib.mk: No such file or directory
make[2]: *** No rule to make target '/usr/lib/R/share/make/shlib.mk'.  Stop.
make[2]: Leaving directory '/tmp/Rtmp9yvDRn/R.INSTALLdbc78d45d54/RNetCDF/src'
make[1]: *** [Makefile:14: all] Error 2
make[1]: Leaving directory '/tmp/Rtmp9yvDRn/R.INSTALLdbc78d45d54/RNetCDF/src'
ERROR: compilation failed for package ‘RNetCDF’
* removing ‘/usr/lib/R/site-library/RNetCDF’
* restoring previous ‘/usr/lib/R/site-library/RNetCDF’

It appears that this error is raised from the tools/make-recursive.sh script. I was surprised by this error message because most other R packages install/update without any issue. So I got curious and tried to see what's wrong with shlib.mk. To my complete surprise, I found that shlib.mk is located at /usr/share/R/share/make/shlib.mk and not where the installation script is looking for it! I wonder if this is a problem on my end (I doubt it, because this is a standard R install using the Debian apt installation instructions on a GCP VM) or whether the installation script is somehow misconfigured.

I was able to install RNetCDF for myself by jiggering paths, and I am sharing my findings in the hope that they help make installation/upgrades more robust. HTH!


System info

~% cat /etc/os-release 
PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
NAME="Debian GNU/Linux"               
VERSION_ID="12"                              
VERSION="12 (bookworm)"
VERSION_CODENAME=bookworm
ID=debian                                                       
HOME_URL="https://www.debian.org/"                                                                                                               
SUPPORT_URL="https://www.debian.org/support"
BUG_REPORT_URL="https://bugs.debian.org/"
~%
~% R --quiet --vanilla -e "sessionInfo()"
> sessionInfo()
R version 4.3.1 (2023-06-16)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Debian GNU/Linux 12 (bookworm)

Matrix products: default
BLAS:   /usr/lib/x86_64-linux-gnu/openblas-pthread/libblas.so.3 
LAPACK: /usr/lib/x86_64-linux-gnu/openblas-pthread/libopenblasp-r0.3.21.so;  LAPACK version 3.11.0

locale:
 [1] LC_CTYPE=C.UTF-8       LC_NUMERIC=C           LC_TIME=C.UTF-8       
 [4] LC_COLLATE=C.UTF-8     LC_MONETARY=C.UTF-8    LC_MESSAGES=C.UTF-8   
 [7] LC_PAPER=C.UTF-8       LC_NAME=C              LC_ADDRESS=C          
[10] LC_TELEPHONE=C         LC_MEASUREMENT=C.UTF-8 LC_IDENTIFICATION=C   

time zone: Etc/UTC
tzcode source: system (glibc)

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

loaded via a namespace (and not attached):
[1] compiler_4.3.1

NetCDF: Invalid argument

I'm trying to run the following code:

  nc_file <- "https://cida.usgs.gov/thredds/dodsC/stageiv_combined"
  RNetCDF::open.nc(nc_file)
Error in RNetCDF::open.nc(nc_file) : NetCDF: Invalid argument

I'm on Windows 10, and my Mac co-workers are able to do this no problem.

When I go in the Terminal:

ncdump --version
netcdf library version 4.7.2 of Oct 22 2019 12:06:21

Can you recommend anything to allow me to use an OPeNDAP URL?

RNetCDF now failing R CMD CHECK

RNetCDF started failing yesterday on my system while running software that had worked for months. The failure is "NetCDF: Filter error: filter not defined for variable". The same error appears when running R CMD check RNetCDF_2.1-1.tar.gz.
check.txt

open.nc("") causes segfault

I see this on Debian, in R 3.4.3.

I'm happy to try to write a PR for this, but I wanted to report it ASAP. It's caught me out a few times when writing tests: if I get the arguments to system.file wrong, I end up with an empty string.

Compiler warnings about function prototypes and sprintf

On 2023-01-16, CRAN sent me an email about new compiler warnings from R-devel, requesting they be fixed before 2023-02-11.

One warning indicated that a C function was being declared without a prototype. This was fixed by declaring the arguments void.

Another warning was about the use of sprintf, which is considered a security risk. This was fixed by using snprintf instead.

typo in create.nc help

In create.nc.Rd:

\item{share}{The buffer scheme. If \code{FALSE} (default), dataset access is buffered and cached for performance. However, if one or more processes may be reading while another process is writing the dataset, set to \code{FALSE}.}

Should the last word be TRUE?

(Reported by Michael Sumner)

Integration of units package

RNetCDF has limited integration with udunits2 for conversion of time units in netcdf attributes. Other R packages now provide interfaces to udunits2, such as https://cran.r-project.org/web/packages/units/index.html . The RNetCDF installation process could be simplified by replacing direct links to the udunits2 library with R functions from the "units" package.

RNetCDF support for a wide range of units could be added via integration with the "units" package (or similar).

Any volunteers?

Installation fails on macOS.

My OS is Darwin Kernel Version 21.6.0: Sun Nov 6 23:31:16 PST 2022; root:xnu-8020.240.14~1/RELEASE_X86_64 x86_64.
I installed R using brew install r.
Then, I ran install.packages("RNetCDF")
Here's the error message:

** testing if installed package can be loaded from temporary location
Error: package or namespace load failed for ‘RNetCDF’ in dyn.load(file, DLLpath = DLLpath, ...):
 unable to load shared object '/usr/local/Cellar/r/4.2.2/lib/R/library/00LOCK-RNetCDF/00new/RNetCDF/libs/RNetCDF.so':
  dlopen(/usr/local/Cellar/r/4.2.2/lib/R/library/00LOCK-RNetCDF/00new/RNetCDF/libs/RNetCDF.so, 0x0006): Library not loaded: '@rpath/libexpat.1.dylib'
  Referenced from: '/usr/local/Cellar/r/4.2.2/lib/R/library/00LOCK-RNetCDF/00new/RNetCDF/libs/RNetCDF.so'
  Reason: tried: '/usr/local/Cellar/r/4.2.2/lib/R/lib/libexpat.1.dylib' (no such file)
Error: loading failed
Execution halted
ERROR: loading failed
* removing ‘/usr/local/Cellar/r/4.2.2/lib/R/library/RNetCDF’

The downloaded source packages are in
	‘/private/var/folders/ff/t85lbw0x7lgdrcxqf7sz33rr0009rr/T/RtmpSIRF36/downloaded_packages’
Updating HTML index of packages in '.Library'
Making 'packages.html' ... done
Warning message:
In install.packages("RNetCDF") :
  installation of package ‘RNetCDF’ had non-zero exit status

Version on a Mac

I would like to test version 2 on my Mac. Just trying devtools::install_github() fails because there must be flags or paths set that I don't have correctly. Any suggestions of key things to look for? I have a complete netcdf installed through Fink, I probably need to point to that correctly.

Thanks for any help and your work on this.

Search for variables based on standard_name attribute

As suggested by @lhmarsden in #102 , var.get.nc would be enhanced by adding an optional argument to search for variables based on the standard_name attribute (instead of the variable name). This would simplify the task of reading datasets created by different people, because standard_name is standardised and consistent, unlike ordinary variable names.

Improvements for time functions

Two issues reported by Zun Yin:

  1. Description of time functions is not clear
  2. Seems impossible to declare calendar used for dates
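As a small worked example of the existing time functions (the units string and values are made up): utcal.nc converts numeric offsets in a "units since reference" form to calendar dates, and utinvcal.nc is its inverse. Only the standard udunits calendar is handled, which relates to point 2.

```r
library(RNetCDF)

# Numeric offsets -> calendar components
# (matrix with columns year, month, day, hour, minute, second):
utcal.nc("days since 2000-01-01 00:00:00", c(0, 31), type = "n")

# The same values as POSIXct timestamps:
utcal.nc("days since 2000-01-01 00:00:00", c(0, 31), type = "c")

# Inverse direction: timestamps -> numeric offsets:
utinvcal.nc("days since 2000-01-01 00:00:00",
            as.POSIXct("2000-02-01", tz = "UTC"))
```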

Configure error with udunits.

Just got up and running with a new Mojave laptop and am working on getting all my dependencies in line. Units installs fine with:

checking for XML_ParserCreate in -lexpat... yes
checking udunits2.h usability... yes
checking udunits2.h presence... yes
checking for udunits2.h... yes
checking for ut_read_xml in -ludunits2... yes

But installing RNetCDF I get:

checking for library containing XML_StopParser... -lexpat
checking for library containing utFree... no
checking for library containing utScan... no
configure: error: "unable to use udunits2 or udunits"

I installed udunits with brew install udunits.

R 3.6.1

Any ideas what to look for? I'm not really familiar with how these packages link up to libraries...

Show user-defined types in print.nc

Although user-defined types can be found using ncdump commands outside R, the print.nc command should allow users to examine the structure of a dataset from within R.

If possible please do not assume /usr/lib/R/share/make/shlib.mk

On Debian / Ubuntu systems since time immemorial:

$ ls -l /usr/lib/R/share/make/shlib.mk
ls: cannot access '/usr/lib/R/share/make/shlib.mk': No such file or directory
$ ls -l /usr/share/R/share/make/shlib.mk
-rw-r--r-- 1 root root 661 Aug  5 16:56 /usr/share/R/share/make/shlib.mk
$ 

so the build breaks. You can make use of R.home("share") to get to it portably:

> R.home("share")
[1] "/usr/share/R/share"
> file.path(R.home("share"), "make", "shlib.mk")
[1] "/usr/share/R/share/make/shlib.mk"
> file.info(file.path(R.home("share"), "make", "shlib.mk"))
                                 size isdir mode               mtime               ctime               atime uid gid uname grname
/usr/share/R/share/make/shlib.mk  661 FALSE  644 2023-08-05 16:56:56 2023-08-05 22:05:45 2023-10-09 11:15:37   0   0  root   root
> 

Hash table for enum types

When reading enum types from netcdf, arbitrary integer values are converted to indices of the corresponding levels in the R factor. The conversion is currently using an iterative search over all known levels, which is reasonable when the number of levels is modest. However, for a large number of levels, a hash-table lookup would be faster.

R environments use a hash table implementation for symbol lookups. Could this be used to search for factor levels efficiently?
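A base-R sketch of that idea (the enum values and levels below are made up): store each enum member value as a symbol in a hashed environment, bound to its level index, so lookups avoid a linear scan over the levels.

```r
# Example enum: member values and their corresponding factor levels (made up).
values <- c(10L, 20L, 55L)
levels <- c("low", "medium", "high")

# Build the hash table: one symbol per enum value, bound to the level index.
h <- new.env(hash = TRUE, size = length(values))
for (i in seq_along(values)) {
  assign(as.character(values[i]), i, envir = h)
}

# Convert raw enum values read from a file to factor indices,
# roughly constant time per value.
raw <- c(55L, 10L, 20L, 55L)
idx <- vapply(as.character(raw), get, integer(1), envir = h)
result <- factor(levels[idx], levels = levels)
```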

Parallel access

Allow netcdf4 files to be opened for parallel access. This requires an MPI communicator, so it depends on an MPI interface for R.

VLEN types

NetCDF4 adds VLEN types, which are similar to lists of vectors in R. A wide range of atomic types are allowed in VLEN variables, and these must be converted to/from R as required.

The low-level type conversions are available in RNetCDF. However, the higher level routines to handle VLEN types have yet to be written.

RNetCDF seems to not support compression

I encountered a problem with netCDF compression. The package seems not to support it and always produces uncompressed files:

library(RNetCDF)
for (dLevel in c(NA, 1:9)) {
  nc <- create.nc(paste0("test_compression_level", dLevel, ".nc"))
  dim.def.nc(nc, "x", 1e2)
  dim.def.nc(nc, "y", 1e2)
  dim.def.nc(nc, "time", unlim=TRUE)
  var.def.nc(nc, "data", "NC_DOUBLE", c("x", "y", "time"), deflate = dLevel,
             shuffle = TRUE)
  att.put.nc(nc, "data", "units", "NC_CHAR", "dB")
  dataNc <- array(rep(1:10, each = 1e4), dim = c(1e2, 1e2, 1e1))
  var.put.nc(nc, "data", dataNc, start = c(1, 1, 1), count = c(1e2, 1e2, 1e1))
  close.nc(nc)
}
print(file.size(paste0("test_compression_level", c(NA, 1:9), ".nc")))

(package version 2.6.1 downloaded from CRAN, R version 4.0.0, run at windows station: Windows 10 Pro, Version 21H1)

Support slices in read.nc

The read.nc function allows start and count arguments to be passed to var.get.nc, but the same arguments are applied to all variables. Thanks to Marc Girondot for reporting this issue.

As a possible improvement, slices could be supported across all variables by specifying dimension names for elements of start and count. Any variables with one or more matching dimension names could be sliced, and all other dimensions could be read in full.
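The proposed behaviour could be sketched as below; the function name read_slice and the named start/count convention are assumptions, not an existing API. In var.get.nc, NA in start means "from the beginning" and NA in count means "full extent".

```r
library(RNetCDF)

# Read every variable, slicing along any dimensions named in start/count
# and reading all other dimensions in full.
read_slice <- function(nc, start, count) {
  out <- list()
  for (id in seq_len(file.inq.nc(nc)$nvars) - 1) {
    info <- var.inq.nc(nc, id)
    s <- rep(NA, info$ndims)
    cnt <- rep(NA, info$ndims)
    if (info$ndims > 0) {
      dnames <- vapply(info$dimids,
                       function(d) dim.inq.nc(nc, d)$name, character(1))
      m <- match(dnames, names(start))
      s[!is.na(m)] <- start[m[!is.na(m)]]
      cnt[!is.na(m)] <- count[m[!is.na(m)]]
    }
    out[[info$name]] <- var.get.nc(nc, id, start = s, count = cnt)
  }
  out
}

# Hypothetical usage: the first time step of every time-dependent variable.
# slices <- read_slice(nc, start = c(time = 1), count = c(time = 1))
```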

Accessing data in groups

I am trying to access data from variables within a NetCDF file that contains hierarchical groups using R. For example, see question I posted on stackoverflow: https://stackoverflow.com/questions/74612898/get-variable-data-out-of-a-group-in-a-netcdf-file-using-rnetcdf-or-ncdf4

I can't find anything about how to do this in the RNetCDF documentation - though this seems to be out of date online.
Latest version: https://www.rdocumentation.org/packages/RNetCDF/versions/2.6-1
Latest documented version: https://www.rdocumentation.org/packages/RNetCDF/versions/1.9-1

Could you help me to do this? Thanks!
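For reference, groups can be reached with grp.inq.nc: the $self element of its result is a handle usable wherever a file handle is expected. The file, group, and variable names below are placeholders.

```r
library(RNetCDF)

nc <- open.nc("someFile.nc")
print.nc(nc)                               # shows the group structure

grp <- grp.inq.nc(nc, "instrument")$self   # handle for group "instrument"
temperature <- var.get.nc(grp, "temperature")

close.nc(nc)
```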

Certificate issue with RNetCDF version of NetCDF-C

There's a longish discussion over here attempting to figure out certificate issues on Windows. I think I'm running into the same thing with RNetCDF.

> RNetCDF::open.nc("https://cida.usgs.gov/thredds/dodsC/prism#log&show=fetch")
#> Note:Caching=1
#> Note:fetch: https://cida.usgs.gov/thredds/dodsC/prism.dds
#> Error:curl error: SSL peer certificate or SSH remote key was not OK
#> curl error details: 
#> Warning:oc_open: Could not read url
#> Note:fetch complete: 0.163 secs
#> Error in RNetCDF::open.nc("https://cida.usgs.gov/thredds/dodsC/prism#log&show=fetch") : 
#>   NetCDF: I/O failure

My Windows and Ubuntu system installs of NetCDF-C can access the offending OPenDAP resources just fine but I do have a rather unique cert chain due to our corporate IT requirements. I'll keep trying to chase this down, but figured I'd leave a note in case others are having trouble or have figured it out?

Write to read-only

Michael Sumner reported the following problem:

I'm trying to read from thredds servers, and I get this error:

u <- "http://coastwatch.pfeg.noaa.gov/erddap/griddap/erdQSwind3day"
library(RNetCDF)
var.get.nc(open.nc(u), "altitude")
Error: NetCDF: Write to read only

ncdf4 works fine:

library(ncdf4)
ncvar_get(nc_open(u), "altitude")
[1] 10

Any ideas?

att.put.nc() does not recognize NA_real_ anymore

The behaviour of att.put.nc() seems to have changed regarding the use of NA_real_ (which is the correct value for missing double-precision data).

Setting a "missing_value" attribute with type "NC_DOUBLE" and value NA_real_, which worked properly with RNetCDF on R 3.3.2, now results in the following error: "NA values sent to netcdf without conversion to fill value".

I found this problem when installing and testing our code on our new computing mainframe:
Linux system, compiler ICC_2018.5.274
R version 3.6.1
netcdf-c-4.7.1 installed

Thanks for any clue !

Avoid empty vectors in output of `var.inq.nc`

Recent versions of netcdf provide an interface to the pluggable filter mechanism in hdf5, mainly to support advanced compression methods.

RNetCDF reports the filter details in the output of var.inq.nc. The filter is specified by filter_id, and arguments to the filter function are shown in filter_params. The filter_params are passed to netcdf as an integer vector, so integer(0) implies that the filter function requires no arguments.

Some existing user code does not expect empty vectors in the output from var.inq.nc. A possible solution is to translate integer(0) to NA in var.inq.nc, and vice-versa in var.def.nc.

Installing RNetCDF from rstudio. defining CPPFLAGS ?

Hi,
I'm a very novice user of R, running on Fedora 34 with R 4.0.5. Using RStudio, I am trying to install RNetCDF:

install.packages("RNetCDF")
Installing package into ‘/home//R/x86_64-redhat-linux-gnu-library/4.0’
(as ‘lib’ is unspecified)
essai de l'URL 'https://cran.rstudio.com/src/contrib/RNetCDF_2.5-2.tar.gz'
Content type 'application/x-gzip' length 143675 bytes (140 KB)
==================================================
downloaded 140 KB

  • installing source package ‘RNetCDF’ ...
    ** package ‘RNetCDF’ correctement décompressé et sommes MD5 vérifiées
    ** using staged installation
    checking for gcc... gcc -m64
    checking whether the C compiler works... yes
    checking for C compiler default output file name... a.out
    checking for suffix of executables...
    checking whether we are cross compiling... no
    checking for suffix of object files... o
    checking whether the compiler supports GNU C... yes
    checking whether gcc -m64 accepts -g... yes
    checking for gcc -m64 option to enable C11 features... none needed
    checking for stdio.h... yes
    checking for stdlib.h... yes
    checking for string.h... yes
    checking for inttypes.h... yes
    checking for stdint.h... yes
    checking for strings.h... yes
    checking for sys/stat.h... yes
    checking for sys/types.h... yes
    checking for unistd.h... yes
    checking size of int... 4
    checking size of long long... 8
    checking size of size_t... 8
    checking for nc-config... no
    checking for netcdf.h... no
    configure: error: netcdf.h was not compiled - defining CPPFLAGS may help
    ERROR: configuration failed for package ‘RNetCDF’
  • removing ‘/home//R/x86_64-redhat-linux-gnu-library/4.0/RNetCDF’
    Warning in install.packages :
    installation of package ‘RNetCDF’ had non-zero exit status

The downloaded source packages are in
‘/tmp/RtmpODM1hG/downloaded_packages’`

I suppose my problem should be solved by the action described in INSTALL (R CMD INSTALL --configure-args="CPPFLAGS=-I/sw/include \ LDFLAGS=-L/sw/lib LIBS=-lhdf5" RNetCDF_2.5-2.tar.gz), but I have never used R CMD INSTALL and I don't know how to pass these arguments to install.packages.
Sorry for the stupid question
Bertrand
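For what it's worth, configure arguments can be passed through install.packages via its configure.args parameter. The include/library paths below are examples that depend on where netcdf is installed; on Fedora, dnf install netcdf-devel typically provides the missing headers.

```r
install.packages("RNetCDF",
                 configure.args = c("CPPFLAGS=-I/usr/include",
                                    "LDFLAGS=-L/usr/lib64",
                                    "LIBS=-lhdf5"))
```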

NA in start/count of var.put.nc

Christopher Barry reported the following problem:

I have found a small issue with var.put.nc. When you supply some (but not all) of the start or count entries as NA, they are corrected to dim(data) if possible. If you are, for example, updating a 2-D matrix slice of a 3-D NetCDF array, this results in an error ("Length of start/count is not ndims") when in fact it should work fine.

The workaround for now is to read the dimension lengths in advance and specify start and count explicitly, so I am able to progress, but I thought I'd let you know.
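The workaround can be sketched as follows; the handle nc, the variable name "var3d", the data slice2d, and the slice index k are assumptions for illustration.

```r
library(RNetCDF)

# Query the length of each dimension of the variable.
info <- var.inq.nc(nc, "var3d")
dimlen <- vapply(info$dimids,
                 function(d) dim.inq.nc(nc, d)$length, numeric(1))

# Write one 2-D slice at position k along the third dimension,
# with fully explicit start and count vectors:
var.put.nc(nc, "var3d", slice2d,
           start = c(1, 1, k),
           count = c(dimlen[1], dimlen[2], 1))
```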

Move type checks from R to C

Each R function performs limited checking of arguments passed to C. It would be safer to perform thorough checks at C level, so that fewer assumptions are made about inputs.

Misleading type in help for utcal

Zun Yin provided the following feedback via email about the documentation of utcal:

Thanks a lot for the improvement of RNetCDF. The document is much clearer and easy to follow. Sorry for late feedback as I don't use it to create netcdf file very often. Now I just find a small error in the manual. The format of time dimension should be NC_DOUBLE, not NC_INT (e.g. Page 24 when you define the variable of time). Otherwise, R will only record the integer and lead to a wrong time. Thanks again for your effort. I benefit a lot from your work :)

Move packing/unpacking from R to C

Use function R_nc_miss_att in variable.c to read packing attributes and pass to conversion routines. Remove packing/unpacking from var.get.nc and var.put.nc.

Build with MinGW NetCDF on Windows

In R version 4, Windows builds use a toolset based on MinGW. This should allow a newer version of the NetCDF library to be used, possibly fixing reported problems with large files and OpenDAP.

Error in checkForRemoteErrors(val) : 2 nodes produced errors; first error: NetCDF: DAP failure

I'm running a package (cft) that relies on some functions from the RNetCDF package, specifically the read.nc function, and am running into errors. In addition to the error above, I have also gotten the error "Error in try(.Call(R_nc_inq_grp_parent, ncid), silent = TRUE) : NetCDF: No group found." and the warnings "no non-missing arguments to min; returning Inf" and "no non-missing arguments to max; returning -Inf".

I am unsure of why this is happening, but it originally seemed to be in relation to R version 4.1, but I am now getting these errors using R version 4.0.5.

Chunked variables

Add options for chunking in variable definitions. Also consider compression and checksumming.

find NetCDF lib version?

Is there any way to find the library version with RNetCDF functions?

I'm seeing failures on Ubuntu Trusty (14.04) with version 4.1.3 when attempting to open the Unidata file test_hgroups.nc, it works fine on Xenial (16.04) with version 4.4.0. I'm not sure ultimately what the problem is (groups?) but having the version would give something to test for R code.

OpenDAP access for password protected files

OpenDAP is a great way to remotely access large NetCDF datasets.

The current version of RNetCDF works with OpenDAP links for un-authenticated URLs. For example, for GHCN CAMS data from the IRI Data Library, this just works:

library(RNetCDF)
# Example URL to IRI Data Library GHCN_CAMS data 
url <- "http://iridl.ldeo.columbia.edu/SOURCES/.NOAA/.NCEP/.CPC/.GHCN_CAMS/.gridded/.deg0p5/.temp/dods"
nc <- open.nc(url)
print.nc(nc)
close.nc(nc)

However, some URLs, including those of the UCAR Research Data Archive (RDA) or the NASA GES DISC are password protected. Currently, my approach for accessing these services is to manually build the subset URL, then download the resulting NetCDF file and read it from disc. For example, for some MERRA outputs:

# Base URL for MERRA at this date
url_base <- "https://goldsmr4.gesdisc.eosdis.nasa.gov:443/opendap/MERRA2/M2I1NXASM.5.12.4/1983/08/MERRA2_100.inst1_2d_asm_Nx.19830801.nc4"
# Add NCDF4 output tail (not a typo -- the ending is `.nc4.nc4`)
url_nc4 <- paste0(url_base, ".nc4")
# Add the subset query for wind U component (variable U2M)
url <- paste0(url_nc4, "?U2M[0:1:23][0:1:1][0:1:1]")

tmp <- tempfile()

library(RNetCDF)
library(curl)
library(magrittr)

h <- new_handle() %>%
  handle_setopt(
    followlocation = TRUE,
    username = my_user,
    password = my_pass
  )

curl_download(url, tmp, handle = h)

nc <- open.nc(tmp)
print.nc(nc)
close.nc(nc)

...or, using the crul package...

library(crul)
http <- HttpClient$new(
  url = url,
  auth = auth(user = my_user, pwd = my_pass)
)
result <- http$get()
tmp2 <- tempfile()
writeBin(result$content, tmp2)
nc <- RNetCDF::open.nc(tmp2)

However, it would be great if there was a way to more directly access the NetCDF file and subset it through R in a way analogous to the non-password-protected services like the first example.

The underlying issue here is that RNetCDF::open.nc only supports simple URLs as connections. However, if it could be modified to work with full HTTP requests generated by curl or crul, I think everything else should work the same.

NC_STRING

Arne Melsom requests support for files with NC_STRING variables. This is a netcdf4 feature, and therefore not supported by current versions of RNetCDF.

udunits2.h not checked/not found (openSUSE)

On openSUSE Tumbleweed, RNetCDF does not find the udunits header file. Configure log shows:

checking for library containing utFree... -ludunits2
checking udunits2/udunits.h usability... no
checking udunits2/udunits.h presence... no
checking for udunits2/udunits.h... no
checking udunits.h usability... no
checking udunits.h presence... no
checking for udunits.h... no
configure: error: "unable to use udunits2 or udunits"

On openSUSE it is available from the udunits2-devel package as /usr/include/udunits2.h. It looks like the configure script simply doesn't check for any header files with a 2 in it.

This is not an issue for the udunits2 R package, that one installs fine, so a solution for this issue would also be to implement #12.

Improve support for nested VLEN types

The netcdf library provides a function nc_free_vlen, which frees memory allocated by the library for storing VLEN types. In principle, nested VLEN types are possible, but nc_free_vlen only frees the top level. To avoid memory leaks, the current version of the netcdf library has added a new function nc_reclaim_data and marked nc_free_vlen as deprecated. The new function should be used in RNetCDF if available in the netcdf library.
