National Seismic Hazard Mapping Project (NSHMP) Web Service Code
License: Other
U.S. Geological Survey (USGS) National Seismic Hazard Mapping Project (NSHMP) web service code.
Please see the wiki for more information.
This repository has been moved to GitLab: https://code.usgs.gov/ghsc/nshmp/nshmp-haz.
When calling the hazard service for the first time, a circular reference between Metadata and the Edition enum throws an exception. Oddly, a usage request does not, and subsequent calculation requests are processed. A possible fix is sketched below.
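One possible fix, sketched here with stand-in classes (not the actual Metadata or Edition code), is to break the static-initialization cycle with the lazy holder idiom:

class Metadata {
  // hypothetical stand-in for the real Metadata class
  String describe(Edition e) { return "metadata for " + e; }
}

enum Edition {
  E2008, E2014;

  // Resolving Metadata lazily avoids touching it while the enum class
  // itself is still initializing, so the first hazard call cannot trip
  // the circular reference.
  String metadata() { return Holder.INSTANCE.describe(this); }

  private static class Holder {
    static final Metadata INSTANCE = new Metadata();
  }
}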
Version information should be moved out to its own class.
The webapp directory is filling up and will continue to do so. We might consider moving all HTML files and supporting folders down into an apps directory; only index.html would need to remain in the webapp directory. Alternatively, it may be possible to specify:
<welcome-file-list>
<welcome-file>apps/index.html</welcome-file>
</welcome-file-list>
and keep all HTML in the apps directory, but I'm not sure the above will work.
Should we handle SiteClass instead of Vs30, or just both (parse number --> parse enum)? See the sketch after the values below.
Sanaz's updated values (adapted from Kircher):
1080 760 530 365 260 185 (m/s)
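A sketch of handling both inputs with a two-step parse, assuming a SiteClass enum keyed to the values above; the letter assignments (B, BC, C, CD, D, DE) are illustrative assumptions, not confirmed labels:

enum SiteClass {
  B(1080), BC(760), C(530), CD(365), D(260), DE(185);

  final double vs30;
  SiteClass(double vs30) { this.vs30 = vs30; }

  static SiteClass fromString(String s) {
    try {
      double v = Double.parseDouble(s); // first, try the input as a number
      for (SiteClass sc : values()) {
        if (sc.vs30 == v) return sc;
      }
      throw new IllegalArgumentException("Unsupported vs30: " + s);
    } catch (NumberFormatException e) {
      return valueOf(s.toUpperCase()); // then, try it as an enum name
    }
  }
}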
Hazard in the CEUS-WUS overlap zone is calculated using the configs of the two models. However, when merging the models, the combined model inherits the config of the first supplied model. This means that for deagg, the incorrect exceedance model may be used (e.g., TRUNCATION_3SIGMA_UPPER vs. NSHM_CEUS_MAX_INTENSITY).
However, the CEUS exceedance model generally applies 3σ truncation everywhere away from New Madrid, so it is okay to use TRUNCATION_3SIGMA_UPPER in the overlap zone. This requires that the WUS model be listed first when merging.
Numerous changes to nshmp-haz, including the addition of deaggregation, have left the ws repository out of sync.
Completion of this issue requires deagg JSON output.
The latest expansion of the hazard web service to support more periods works well, but, like the static services (UHT usgs/earthquake-hazard-tool#272), selection of vs30=760 west of -115 causes web service requests to specify COUS, which in turn restricts the number of spectral periods supported when processing the imt=any argument.
SpectraService does not currently return PGA values. Some GMMs support 0.01s, the coefficients for which are often the same as those for PGA. For the time being, return PGA as 0.001s; see the sketch below.
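A minimal sketch of the workaround, using hypothetical method and list parameters rather than the actual SpectraService API:

import java.util.List;

class SpectrumUtil {
  // Prepend PGA to a response spectrum using 0.001s as a placeholder period.
  static void addPga(List<Double> periods, List<Double> means, double pgaMean) {
    periods.add(0, 0.001); // pseudo-period standing in for PGA
    means.add(0, pgaMean);
  }
}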
Use metadata constraints to highlight (in red) fields for which invalid values have been entered.
Per recent discussion, add the ability to override static and dynamic server settings (and perhaps other settings in the future) via a config.json file. The file would also be added to .gitignore. A possible shape is sketched below.
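A sketch of how the override might be read with Gson (already a project dependency). The file shape { "staticServer": ..., "dynamicServer": ... } and the field names are assumptions; null fields would mean "keep the built-in default":

import com.google.gson.Gson;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

class ServerConfig {
  String staticServer;  // overrides the built-in static server setting
  String dynamicServer; // overrides the built-in dynamic server setting

  static ServerConfig load(String path) throws Exception {
    Path p = Paths.get(path);
    if (!Files.exists(p)) return new ServerConfig(); // no file: use defaults
    try (Reader r = Files.newBufferedReader(p)) {
      return new Gson().fromJson(r, ServerConfig.class);
    }
  }
}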
...and not returned as JSON
Lacking this, the web application will not work when requested over HTTPS.
Example:
https://earthquake.usgs.gov/nshmp-haz-ws/hazard
Expected Result:
A "syntax" value beginning with "https://..."
Actual Result:
A "syntax" value beginning with "http://..."
Possible Resolution:
The code can probably check the "X-Forwarded-Proto" header to see how the original request was made and then mirror it in its response; see the sketch below.
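A sketch of the header check, assuming standard servlet API calls (the class and method names here are illustrative):

import javax.servlet.http.HttpServletRequest;

class Protocol {
  // Mirror the protocol of the original request when behind a proxy.
  static String of(HttpServletRequest request) {
    String forwarded = request.getHeader("X-Forwarded-Proto");
    return (forwarded != null) ? forwarded : request.getScheme();
  }
}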
If the web application determines the model should be COUS, the web service seems not to know how to handle it...
java.lang.IllegalArgumentException: No enum constant gov.usgs.earthquake.nshm.www.services.Model.COUS_2008
at java.lang.Enum.valueOf(Enum.java:236)
at gov.usgs.earthquake.nshm.www.services.Model.valueOf(Model.java:10)
at gov.usgs.earthquake.nshm.www.services.HazardCurve.processCalculation(HazardCurve.java:140)
at gov.usgs.earthquake.nshm.www.services.HazardCurve.doGet(HazardCurve.java:97)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.filters.CorsFilter.handleSimpleCORS(CorsFilter.java:301)
at org.apache.catalina.filters.CorsFilter.doFilter(CorsFilter.java:165)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1527)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1484)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Not sure about the best method to resolve this, since there is no "COUS_*" model repo. One possibility is sketched below.
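One hedged possibility, assuming the regional enum constants exist (the Model enum below is a stand-in, not the actual class): map a COUS request onto the two regional models, listing WUS first so the merged model inherits the WUS exceedance config, per the overlap-zone note above.

enum Model { WUS_2008, CEUS_2008, WUS_2014, CEUS_2014 } // assumed constants

class ModelResolver {
  static Model[] resolve(String region, String year) {
    if ("COUS".equals(region)) {
      // WUS first: merged model inherits the WUS exceedance config.
      return new Model[] {
          Model.valueOf("WUS_" + year),
          Model.valueOf("CEUS_" + year) };
    }
    return new Model[] { Model.valueOf(region + "_" + year) };
  }
}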
On the main Tomcat page for the webapp... http://host:port/nshmp-haz-ws/
Returns HTML describing the "HazardCurve" endpoint. This has been renamed to "hazard". Also add documentation for deaggregation.
The "syntax" usage described on the endpoint documentation page also suggests "HazardCurve", which breaks. Similarly, the syntax usage described on the "deagg" endpoint seems to refer to "Deagg" instead. This should all be consistent.
Currently, the '/any/' URL fragment reads the IMTs to run from the defaults specified with the models. A sketch of resolving the COUS set follows the lists below.
COUS: PGA, 0.1s, 0.2s, 0.3s, 0.5s, 0.75s, 1.0s, 2.0s, 3.0s, 5.0s
(the set supported by both the WUS and CEUS models)
AK: PGA, 0.1s, 0.2s, 0.3s, 0.5s, 1.0s, 2.0s
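If, as noted above, selecting COUS restricts the available periods, the COUS set could be computed as the IMTs common to both regional models. A sketch with an assumed Imt enum (names illustrative):

import java.util.EnumSet;

class ImtResolver {
  enum Imt { PGA, SA0P1, SA0P2, SA0P3, SA0P5, SA0P75, SA1P0, SA2P0, SA3P0, SA5P0 }

  static EnumSet<Imt> cousImts(EnumSet<Imt> wus, EnumSet<Imt> ceus) {
    EnumSet<Imt> common = EnumSet.copyOf(wus);
    common.retainAll(ceus); // keep only IMTs both models support
    return common;
  }
}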
Standard deviations don't compare well in log space. Is it possible to decouple y-axis scaling in the sigma plot from the ground motion plot?
Depends on #158
Rename:
Add panel to dashboard pointing to the services documentation page.
Consider if/how to merge other application documentation.
Update header (and footer?) on index page to match
Filter out services and HTML content intended for internal use only. May require creating an internal package or directories.
Test location in UHT: 48, -117, 1s; contributions, albeit negligible, are present for CEUS GMM
Create a plot that uses the D3 satellite projection to:
SINGLE: contributors from deagg service JSON as vertical bars; height = contribution

Currently, when deploying a packed WAR file to dev and prod systems, a zip error occurs (see trace below). It's not clear what changed to cause this to start occurring.
One possible solution is to not pack the models in zip files during a build (ant: zipfileset); there's no real benefit to doing so.
com.google.common.util.concurrent.UncheckedExecutionException: com.google.gson.JsonIOException: java.util.zip.ZipException: invalid stored block lengths (see logs)
...
java.util.concurrent.ExecutionException:
com.google.common.util.concurrent.UncheckedExecutionException: com.google.gson.JsonIOException: java.util.zip.ZipException: invalid stored block lengths
...
org.opensha2.eq.model.ModelConfig$Builder.fromFile(ModelConfig.java:140)
org.opensha2.eq.model.Loader.load(Loader.java:90)
org.opensha2.eq.model.HazardModel.load(HazardModel.java:86)
gov.usgs.earthquake.nshm.www.ServletUtil.loadModel(ServletUtil.java:156)
gov.usgs.earthquake.nshm.www.ServletUtil.access$000(ServletUtil.java:49)
gov.usgs.earthquake.nshm.www.ServletUtil$1.load(ServletUtil.java:97)
gov.usgs.earthquake.nshm.www.ServletUtil$1.load(ServletUtil.java:94)
It looks like when nshmp-haz-ws is built, the call to git describe when building nshmp-haz is being executed in the nshmp-haz-ws directory, as the app.version key in both app.properties (in the nshmp-haz.jar root) and service.properties (in WEB-INF/classes) has the same value. Moreover, even though the values are the same, the "service" JSON metadata element does not appear to be picking up the nshmp-haz-ws version.
Per discussion, this plotter would be similar to GM v. Distance, with some differences:
Add a script to the repository that will read a properties file and update the required repos to the specified tags. Understand that this script could locally delete itself if the nshmp-haz-ws version specified in the properties file predates the release.
Placeholder issue...
See also usgs/nshmp-haz#216
For consistency in the earthquake-hazard-tool, nshmp-haz-ws specifies finite vs30 values that each model supports. In regions like the WUS, there is strong demand to use values outside those specified. If region == WUS, allow any vs30 between the min and max to pass through despite not matching a metadata Vs30 enum value; see the sketch below.
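A sketch of the proposed exception; the range bounds and enumerated values are assumptions for illustration:

import java.util.Arrays;

class Vs30Validator {
  static final double[] ENUM_VALUES = { 1080, 760, 530, 365, 260, 185 };

  static double validate(String region, double vs30) {
    if ("WUS".equals(region)) {
      if (vs30 < 150.0 || vs30 > 1500.0) { // assumed model min/max
        throw new IllegalArgumentException("vs30 out of range: " + vs30);
      }
      return vs30; // any in-range value passes through
    }
    // other regions: still require one of the enumerated values
    if (Arrays.stream(ENUM_VALUES).anyMatch(v -> v == vs30)) return vs30;
    throw new IllegalArgumentException("Unsupported vs30: " + vs30);
  }
}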
The application assumes a vertical strike-slip fault, normal to the line of section. It will work similarly to the spectra plot. Requirements:
There are two possible variants of this tool that can be developed from the base case above:
Further to the above, there is a variant that plots the fault below the data and permits studying geometric and hanging-wall effects.
Currently they are displayed as:
For consistency, these should be:
Similar updates for other periods; currently only these three are supported... (?)
The deagg program supports deaggregating at a return period and at an IML; the service should be able to leverage this via the deagg/ and deaggiml/ paths.
Currently it appears the CORS headers are configured to use the defaults (which allow everything). This is okay by me, but the issue I'm having is that while the header configuration for allowed origins is "*", the actual header value returned mirrors the request referrer host/protocol/port.
The issue with this is that, in practice, these responses are cached (headers and all) and we do not vary on the Access-Control-Allow-Origin header. In this way, requests from different referrers for the same resources may cause XHR requests to fail, as the returned allowed-origin header may not match the request referrer header when the XHR is served from a cached response.
This is unlikely to affect public requests often, unless the web services become popular with external developers, but it will affect developers regularly. One mitigation is sketched below.
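A minimal sketch of the caching fix, assuming a standard servlet filter can be placed in front of the CORS filter (VaryOriginFilter is a hypothetical name, not part of the deployed configuration): it declares that responses vary on Origin, so shared caches key on that header.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class VaryOriginFilter implements Filter {
  @Override public void init(FilterConfig config) {}
  @Override public void destroy() {}

  @Override
  public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
      throws IOException, ServletException {
    // Tell caches that the allowed-origin header depends on the request Origin.
    ((HttpServletResponse) res).addHeader("Vary", "Origin");
    chain.doFilter(req, res);
  }
}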
Currently, most metadata supplied to services is manually managed; this is due, in part, to the fact that models only load on demand. We should consider returning a usage string derived from available models, but this will require auto-loading all models, which will slow startup time.
To streamline local development and internal use of nshmp-haz, add a deploy step to build.xml conditioned on the deploy destination being defined as a property.
Related to nshmp-haz #203. Add code and model version information to JSON usage and result responses.
Create an MFD plotter to further support source model mining, and to support future enhancements to the fault source parameter DB pages. This requires backfilling source IDs consistent with the fault source param DB tables, and the ability to collapse logic-tree branch MFDs to a total MFD.
See also usgs/nshmp-haz#287
Per discussion, we agreed to add a third tab to plot windows that would:
"server"
element at the root of the response and should be rendered as, e.g., nshmp-haz: v1.2.0-17-g2b25de4
.Also consider adding flag to print process to print metadata, or not.
Edition version numbers are currently hard-coded in the Edition enum; they should instead be added dynamically to the labels returned with service metadata. A sketch follows.
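A minimal sketch of the idea, with hypothetical enum constants, display names, and a label(version) method (none of these are the actual fields):

enum Edition {
  E2008("Conterminous U.S. 2008"),
  E2014("Conterminous U.S. 2014");

  private final String displayName;
  Edition(String displayName) { this.displayName = displayName; }

  // Build the label at runtime from version metadata instead of hard-coding it.
  String label(String version) {
    return displayName + " (" + version + ")";
  }
}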
For work on location/region/testSite widget, these classes should be self consistent.
Read properties from a user file; these will override any defined in the build file itself.