
Comments (8)

bashir2 commented on September 23, 2024

The problem you have faced is known; please see this issue, and in particular the "expiry" problem. The root cause is that HAPI keeps a list of resources associated with each search, and that list expires after a certain time (by default one hour, I think). That's why we want to implement better API-based solutions (i.e., the above issue).
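
A quick way to confirm the expiry yourself is to take a failing paging URL from your pipeline log and request it again after the retention window; the search ID below is just a placeholder. A minimal sketch:

# expect 410 (Gone) once the server-side search has expired
curl -s -o /dev/null -w "%{http_code}\n" \
  "http://localhost:8080/fhir?_getpages=<search-id>&_getpagesoffset=0"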

A few points:

  • If your FHIR server is HAPI, can you try the JDBC mode instead? For that you need to provide the HAPI DB configuration via --fhirDatabaseConfigPath (like this) and enable --jdbcModeHapi; see the sketch after this list.
  • You are setting --parallelism=2; this means only two threads will be used. If you have more cores on your machine, please consider increasing parallelism.
  • Re. the controller not making any progress, if you can share the configuration and some logs of the controller, that would be helpful.
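
For the JDBC option in the first bullet, a rough sketch of the invocation: this is not a definitive command, it just combines the flags mentioned above with the jar and class you are already running, and the config file path is a placeholder for wherever your HAPI DB settings live.

# sketch only; --fhirDatabaseConfigPath points at a hypothetical HAPI DB config file
java -Xmx8192M -cp ./pipelines/batch/target/batch-bundled.jar com.google.fhir.analytics.FhirEtl \
  --fhirServerUrl=http://localhost:8080/fhir \
  --outputParquetPath=/tmp/parquet/ \
  --resourceList=Patient,Encounter,Observation,Condition,DiagnosticReport \
  --jdbcModeHapi=true \
  --fhirDatabaseConfigPath=./hapi-db-config.json \
  --parallelism=4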


bashir2 commented on September 23, 2024

It should produce multiple resource types if the FHIR store has those (this is part of the continuous-build tests). From your arguments it seems you are using the FHIR Search API mode. In that case you should see log lines like this:

Number of resources for [RESOURCE_TYPE] search is [NUMBER]

Can you please check your logs and see what the number is for each resource type?


citizenrich commented on September 23, 2024

Sure. Here's the command and a grep of that output:

ryzen $ java -Xmx8192M -cp ./pipelines/batch/target/batch-bundled-0.1.0-SNAPSHOT.jar com.google.fhir.analytics.FhirEtl --fhirServerUrl=http://localhost:8080/fhir --outputParquetPath=/tmp/parquet/ --resourceList=Patient,Encounter,Observation,Condition,DiagnosticReport,MedicationStatement --batchSize=10 --jdbcModeHapi=false --jdbcModeEnabled=false | grep "Number of resources"
18:21:09.641 [main] INFO  c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.createSegments:239 - Number of resources for Patient search is 1639
18:21:55.841 [main] INFO  c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.createSegments:239 - Number of resources for Encounter search is 128015
18:22:42.283 [main] INFO  c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.createSegments:239 - Number of resources for Observation search is 571842
18:23:28.373 [main] INFO  c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.createSegments:239 - Number of resources for Condition search is 74596
18:24:14.496 [main] INFO  c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.createSegments:239 - Number of resources for DiagnosticReport search is 60200
18:24:14.503 [main] INFO  c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.createSegments:239 - Number of resources for MedicationStatement search is 0


bashir2 commented on September 23, 2024

Do you get any error messages? Is it possible to share the whole pipeline logs?


citizenrich commented on September 23, 2024

The log is huge. One repeated error is this:

11:56:57.305 [direct-runner-worker] ERROR c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.searchByUrl:70 - Failed to search for url: http://localhost:8080/fhir?_getpages=765a4163-6ec6-4987-906e-28e964ea17b5&_getpagesoffset=98170 ;  Exception: ca.uhn.fhir.rest.server.exceptions.InternalErrorException: HTTP 500 : Could not open JPA EntityManager for transaction; nested exception is org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection

The command used is:

java -Xmx8192M -cp ./pipelines/batch/target/batch-bundled-0.1.0-SNAPSHOT.jar com.google.fhir.analytics.FhirEtl --fhirServerUrl=http://localhost:8080/fhir --outputParquetPath=/tmp/parquet/ --resourceList=Patient,Encounter,Observation,Condition,DiagnosticReport --batchSize=10 --jdbcModeHapi=false --jdbcModeEnabled=false

So I am not sure why there would be a JDBC error when JDBC is explicitly not being used; it should only be using FHIR search.


citizenrich commented on September 23, 2024

The latest run gave only Condition and Patient. Here's the first 1000 lines of stdout.
https://pastebin.com/kyYfKq7n


bashir2 commented on September 23, 2024

The log is huge. One repeated error is this:

11:56:57.305 [direct-runner-worker] ERROR c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.searchByUrl:70 - Failed to search for url: http://localhost:8080/fhir?_getpages=765a4163-6ec6-4987-906e-28e964ea17b5&_getpagesoffset=98170 ;  Exception: ca.uhn.fhir.rest.server.exceptions.InternalErrorException: HTTP 500 : Could not open JPA EntityManager for transaction; nested exception is org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection

This error is coming from the FHIR server, i.e., when we fetch resources through the FHIR API (not JDBC), the server returns the 500 error above. It seems to be caused by DB connection starvation on the FHIR server side (HAPI?). My guess is that the size of the connection pool is bigger than what the DB supports by default. Can you share the configuration of your HAPI server? Also, if you look at the error logs that you get from HAPI, they may tell us more about what the problem is.
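
If this is the stock Spring Boot based HAPI JPA starter image, one thing to compare is the HikariCP pool size against what the backing database allows. A hedged sketch, assuming the image honors standard Spring Boot environment overrides and a Postgres backend; property names may differ in your deployment:

# check what the database allows (connection options omitted)
psql -c "SHOW max_connections;"

# size the HAPI connection pool accordingly (standard Spring Boot / HikariCP property)
docker run -p 8080:8080 \
  -e SPRING_DATASOURCE_HIKARI_MAXIMUMPOOLSIZE=30 \
  hapiproject/hapi:latest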

The command used is:

java -Xmx8192M -cp ./pipelines/batch/target/batch-bundled-0.1.0-SNAPSHOT.jar com.google.fhir.analytics.FhirEtl --fhirServerUrl=http://localhost:8080/fhir --outputParquetPath=/tmp/parquet/ --resourceList=Patient,Encounter,Observation,Condition,DiagnosticReport --batchSize=10 --jdbcModeHapi=false --jdbcModeEnabled=false

The other issue that I see with this command is that you are using the default "DirectRunner", which is not recommended for production use (we mostly use it for test purposes). If you have a large amount of data, please use --runner=FlinkRunner.

Also, is there a particular reason you don't use the pipeline/controller app instead of running the pipeline directly from the command line? It is fine if you prefer that; I am just curious, because the controller app takes care of some of these details.


citizenrich commented on September 23, 2024

I redid my setup and have the following error:

19:03:30.163 [CHAIN DataSource (at Create.Values2/Read(CreateSource) (org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat)) -> FlatMap (FlatMap at FetchResources2/ParDo(Search)/ParMultiDo(Search)) -> FlatMap (FlatMap at FetchResources2/ParDo(Search)/ParMultiDo(Search).output) (1/2)#0] ERROR c.g.fhir.analytics.FhirSearchUtil com.google.fhir.analytics.FhirSearchUtil.searchByUrl:70 - Failed to search for url: http://localhost:8080/fhir?_getpages=19680b29-323f-4b39-921f-79a109ce9231&_getpagesoffset=625754 ;  Exception: ca.uhn.fhir.rest.server.exceptions.ResourceGoneException: HTTP 410 : HAPI-0417: Search ID "19680b29-323f-4b39-921f-79a109ce9231" does not exist and may have expired

Here are the steps that I took:

  • Clone and generate 1000 patients with Synthea.
  • Start HAPI, e.g. docker run -p 8080:8080 hapiproject/hapi:latest
  • Load the patient bundles into HAPI (roughly as in the curl sketch after this list). This takes a while.
  • Ensure Java 17 is being used (set via environment/classpath).
  • Build the JAR file for pipelines. No errors. All tests pass.
  • Run pipelines, e.g. java -Xmx8192M -cp ./pipelines/batch/target/batch-bundled.jar com.google.fhir.analytics.FhirEtl --fhirServerUrl=http://localhost:8080/fhir --outputParquetPath=/tmp/parquet/ --resourceList=Patient,Encounter,Observation,Condition,DiagnosticReport --batchSize=2 --jdbcModeHapi=false --jdbcModeEnabled=false --runner=FlinkRunner --parallelism=2
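
For the bundle-loading step, roughly the following (a sketch: Synthea's default FHIR output is transaction bundles, which are POSTed to the server base URL; the output path below is the Synthea default and may differ):

# POST each Synthea transaction bundle to the HAPI base URL
for f in synthea/output/fhir/*.json; do
  curl -sS -X POST -H "Content-Type: application/fhir+json" \
    --data-binary @"$f" http://localhost:8080/fhir > /dev/null
done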

I need to access the raw Parquet files to import them into DuckDB. When I use the Docker Compose method in the docs, it hangs at 13% and doesn't recover even after several hours.
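
For context, the DuckDB import I'm after is something like the following sketch; the glob assumes per-resource-type Parquet files somewhere under /tmp/parquet/, so it may need adjusting to the layout the pipeline actually writes:

# load the Patient Parquet output into a DuckDB table (SQL passed as a trailing argument)
duckdb fhir.duckdb \
  "CREATE TABLE patient AS SELECT * FROM read_parquet('/tmp/parquet/**/Patient*.parquet');"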

