
Commit 70805fd

Merge pull request #7 from rajavaid77/rajavaid-fix-bedrock-agent-notebook
updated arch diagram and description
2 parents 5ebe519 + a13aaf0

File tree

2 files changed: +33 -16 lines changed


20-Industry-Use-Cases/22-Medical-Claims-Processing/22_medical_claims_processing.ipynb

Lines changed: 33 additions & 16 deletions
@@ -34,22 +34,39 @@
 "id": "a7f3f1e9-684e-428d-9598-a018da023dfc",
 "metadata": {},
 "source": [
-"The architecture for the solution is shown below. The process begins with a Medical Claim Form (CMS 1500) received from a medical provider - \n",
 "\n",
-"- The claim form is stored in S3, An S3 Object Created event is matched by a configured EventBridge Rule.\n",
-"- EventBridge rule triggers an AWS Lambda function that in turn calls the `InvokeDataAutomationAsync` job in BDA along with a custom blueprint.\n",
-"- BDA uses the provided custom blueprint for the CMS 1500 form to extract the content of the claim.\n",
-"- BDA stores extracted claims data in S3 in a location that we provide, the corresponding event is matched by a configured EventBridge rule.\n",
-"- EventBridge rule triggers an AWS Lambda function that in turn invoke a Bedrock Agent configured to process the claim.\n",
-"- The Bedrock Agent is configured with an Action group, with business logic for the actions implemented in a AWS Lambda function.\n",
-"- The configured actions include fetching insured member and patient details as well as creating and updating claims records\n",
-"- The business logic in actions use a `claims database` implemented in Amazon Aurora (Postgres) for storing claims related data.\n",
-"- The agent is also associated with a Bedrock Knowledge Base is backed by an S3 data source.\n",
-"- We ingest a set of Explanation of Coverage (EoC) documents into the knowledge Base to mimic available multiple insurance plans with different levels of coverage.\n",
-"- With the EoC documents ingested, the Knowlegde Base can answer queries about insured member's insurance plan coverage details.\n",
-"- When Invoked, the agent goes through the steps to process the claim, including verifying member/patient details and creating claims record in the database.\n",
-"- The agent also gathers coverage details on the treatments, services and supplies provided in the claims form.\n",
-"- The agent finished with providing a report of the verification process that can be used by the claims adjudicator to decide on the outcome of the claim. "
+"\n",
+"### Claim Review Process\n",
+"The architecture for the claim review solution is shown below. The process begins with a Medical Claim Form (CMS 1500) received from a medical provider:\n",
+"\n",
+"- **Step 1:** The medical provider submits the claim, which is stored in an S3 bucket.\n",
+"\n",
+"- **Step 2:** An S3 Object Created event is matched by a configured EventBridge rule.\n",
+"- **Step 3:** The EventBridge rule triggers an AWS Lambda function.\n",
+"- **Step 4:** The Lambda function in turn calls the `InvokeDataAutomationAsync` job in BDA along with a custom blueprint.\n",
+"- **Step 5:** BDA uses the provided custom blueprint for the CMS 1500 form to extract the content of the claim. BDA stores the extracted claims data in the S3 bucket specified in the API call.\n",
+"- **Step 6:** BDA sends a job completion event to EventBridge that includes the job status and the S3 URI of the response and metadata.\n",
+"- **Step 7:** An EventBridge rule triggers an AWS Lambda function.\n",
+"- **Step 8:** The Lambda function uses the BDA job response and metadata to fetch the extracted claim form data.\n",
+"- **Step 9:** The Lambda function then invokes the pre-configured Bedrock Agent to process the claim.\n",
+"- **Step 10:** The Bedrock Agent uses configured agent actions (fulfilled by a Lambda function) to perform the tasks required during the claim verification process.\n",
+"- **Step 11:** The Bedrock Agent uses an agent action, implemented in a Lambda function, to query member and patient information stored in Aurora PostgreSQL in order to complete claim verification. The agent also uses an agent action to store verified claim details in the database.\n",
+"- **Step 12:** The Bedrock Agent queries the Bedrock Knowledge Base to gather coverage details on the treatments, services, and supplies listed in the claim form.\n",
+"- **Step 13:** The Bedrock Agent creates a final report detailing the validation process and stores the report in S3.\n",
+"\n",
+"\n",
+"### Claim Policy Knowledge Base Ingestion Process\n",
+"The Bedrock Knowledge Base is used for Retrieval Augmented Generation (RAG), a technique that uses information from data sources to improve the relevancy and accuracy of generated responses. In this example, the knowledge base converts claim policy documents in PDF format into vector embeddings and stores them in a vector store (an Amazon OpenSearch Serverless vector index). This process of converting the data into vector embeddings is called ingestion. The ingestion process is carried out to make the knowledge base ready to be queried by the agent (or humans).\n",
+"\n",
+"[More details on the ingestion process](https://docs.aws.amazon.com/bedrock/latest/userguide/kb-how-data.html)\n",
+"\n",
+"- **Step 1:** The medical provider uploads a claim policy document to an S3 bucket.\n",
+"\n",
+"- **Step 2:** An S3 Object Created event is matched by a configured EventBridge rule.\n",
+"- **Step 3:** The EventBridge rule triggers an AWS Lambda function.\n",
+"- **Step 4:** The Lambda function in turn invokes the `StartIngestionJob` API to trigger a Knowledge Base data source sync job.\n",
+"- **Step 5:** The data source sync job incrementally converts the raw claim policy documents in the S3 bucket into vector embeddings, based on the vector embeddings model and configurations specified.\n",
+"- **Step 6:** The vector embeddings are stored in the configured vector store (OpenSearch Serverless in this case)."
 ]
 },
 {
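The S3-to-Lambda wiring in Steps 2-3 (and again in ingestion Steps 2-3) is a standard EventBridge pattern. A minimal sketch follows, assuming boto3 and a bucket with EventBridge notifications enabled; the rule, bucket, and function names are placeholders, not values from the notebook:

```python
import json
import boto3

events = boto3.client("events")

# Match "Object Created" events emitted by the claims intake bucket.
# (EventBridge notifications must be enabled on the bucket.)
rule_arn = events.put_rule(
    Name="claims-object-created",  # placeholder rule name
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["Object Created"],
        "detail": {"bucket": {"name": ["claims-intake-bucket"]}},
    }),
    State="ENABLED",
)["RuleArn"]

# Target the Lambda function that kicks off the BDA job (Step 4).
events.put_targets(
    Rule="claims-object-created",
    Targets=[{
        "Id": "claims-intake-fn",
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:claims-intake",
    }],
)
```

The Lambda function also needs a resource-based policy (added via `lambda:AddPermission` with principal `events.amazonaws.com` and the rule ARN as source) before the rule can invoke it.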
@@ -1046,7 +1063,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.11.11"
+"version": "3.11.10"
 }
 },
 "nbformat": 4,
[Second changed file: binary image, 341 KB — the updated architecture diagram (not rendered).]
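The hand-off to the agent in Step 9 is a single `InvokeAgent` call. A minimal sketch using the boto3 `bedrock-agent-runtime` client; the agent ID, alias ID, and prompt text are placeholders:

```python
import json
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime")

def process_claim(claim_data: dict, session_id: str) -> str:
    response = agent_runtime.invoke_agent(
        agentId="AGENT1234",       # placeholder agent ID
        agentAliasId="ALIAS1234",  # placeholder alias ID
        sessionId=session_id,
        inputText=(
            "Process this medical claim. Verify the member and patient, "
            "create the claim record, and check coverage:\n"
            + json.dumps(claim_data)
        ),
    )
    # invoke_agent streams the completion back as chunk events.
    report = ""
    for event in response["completion"]:
        chunk = event.get("chunk")
        if chunk:
            report += chunk["bytes"].decode("utf-8")
    return report
```

During this one call the agent works through Steps 10-13: it invokes its action group Lambda against Aurora, queries the knowledge base for coverage, and returns the verification report.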

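Finally, Step 4 of the ingestion process maps to the `StartIngestionJob` API. A sketch of that Lambda's core logic, assuming the boto3 `bedrock-agent` client; the knowledge base and data source IDs are placeholders:

```python
import time
import boto3

bedrock_agent = boto3.client("bedrock-agent")

def sync_knowledge_base(kb_id: str = "KB1234", ds_id: str = "DS1234") -> str:
    # Step 4: kick off a data source sync for the newly uploaded policy PDFs.
    job = bedrock_agent.start_ingestion_job(
        knowledgeBaseId=kb_id,
        dataSourceId=ds_id,
    )["ingestionJob"]

    # Poll until the sync finishes (fine for a demo; prefer events or
    # Step Functions in production).
    while job["status"] not in ("COMPLETE", "FAILED"):
        time.sleep(10)
        job = bedrock_agent.get_ingestion_job(
            knowledgeBaseId=kb_id,
            dataSourceId=ds_id,
            ingestionJobId=job["ingestionJobId"],
        )["ingestionJob"]
    return job["status"]
```

Because the sync is incremental (Step 5), only documents added or changed since the last job are re-chunked and re-embedded into the OpenSearch Serverless index.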