import

command
v2.44.0
Warning

This package is not in the latest version of its module.

Published: Jan 6, 2023 · License: Apache-2.0, BSD-3-Clause, MIT · Imports: 7 · Imported by: 0

Documentation

Overview

import is a pipeline example using the fhirio connector to bulk import FHIR resources from GCS into a given FHIR store.

Pre-requisites:

1. NDJSON-encoded FHIR resources stored in GCS.
2. Dataflow Runner enabled: https://cloud.google.com/dataflow/docs/quickstarts.
3. A Google Cloud FHIR store.
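The first prerequisite expects NDJSON (newline-delimited JSON): one complete FHIR resource serialized as a JSON object per line, with no enclosing array and no commas between lines. A minimal sketch of what such a file looks like (the file name, resource contents, and destination bucket are illustrative assumptions):

```shell
# Each line is a standalone FHIR resource; the file as a whole is NOT valid JSON.
cat > patients.ndjson <<'EOF'
{"resourceType":"Patient","id":"example-1","name":[{"family":"Doe","given":["Jane"]}]}
{"resourceType":"Patient","id":"example-2","name":[{"family":"Doe","given":["John"]}]}
EOF

# Copy to the GCS source location (bucket and path are assumptions):
# gsutil cp patients.ndjson gs://MY-BUCKET/path/to/resources/
```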

Running this pipeline requires a fully qualified GCS path (which may contain wildcards) to where your FHIR resources are stored and the path of the FHIR store the resources should be written to, in addition to the usual flags for the Dataflow runner.
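The FHIR store path follows the Cloud Healthcare API resource-name format. A sketch of setting it (the project, location, dataset, and store names are placeholder assumptions):

```shell
# Cloud Healthcare API FHIR store resource name:
#   projects/{project}/locations/{location}/datasets/{dataset}/fhirStores/{fhirStore}
# All segment values below are illustrative placeholders.
export FHIR_STORE_PATH="projects/MY_PROJECT/locations/us-central1/datasets/MY_DATASET/fhirStores/MY_FHIR_STORE"
```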

An example command for executing this pipeline on GCP is as follows:

export PROJECT="$(gcloud config get-value project)"
export TEMP_LOCATION="gs://MY-BUCKET/temp"
export STAGING_LOCATION="gs://MY-BUCKET/staging"
export REGION="us-central1"
export SOURCE_GCS_LOCATION="gs://MY-BUCKET/path/to/resources/**"
export FHIR_STORE_PATH="MY_FHIR_STORE_PATH"
cd ./sdks/go
go run ./examples/fhirio/import/import.go \
  --runner=dataflow \
  --temp_location=$TEMP_LOCATION \
  --staging_location=$STAGING_LOCATION \
  --project=$PROJECT \
  --region=$REGION \
  --worker_harness_container_image=apache/beam_go_sdk:latest \
  --sourceGcsLocation=$SOURCE_GCS_LOCATION \
  --fhirStore=$FHIR_STORE_PATH
