lirony committed on
Commit ff5e06a · 0 Parent(s):
.dockerignore ADDED
@@ -0,0 +1,6 @@
+ .git
+ *.pyc
+ __pycache__/
+ .dockerignore
+ Dockerfile
+ iterative_langgraph_Agentic_RAG_with_MedGemma.ipynb
.gitattributes ADDED
@@ -0,0 +1,38 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
+ fhir_cache.db filter=lfs diff=lfs merge=lfs -text
+ llm_cache.db filter=lfs diff=lfs merge=lfs -text
+ static/background.jpg filter=lfs diff=lfs merge=lfs -text
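The `.gitattributes` entries above route matching files through Git LFS. As a rough illustration (not how git itself evaluates attributes — gitattributes globbing has extra rules such as `**`), a hypothetical `tracked_by_lfs` helper can approximate the pattern matching with `fnmatch`:

```python
from fnmatch import fnmatch

# A subset of the LFS-tracked patterns from .gitattributes above.
LFS_PATTERNS = ["*.bin", "*.safetensors", "*.tar.*", "fhir_cache.db", "llm_cache.db"]

def tracked_by_lfs(path: str) -> bool:
    """Return True if the file's basename matches any LFS pattern (approximation)."""
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, pat) for pat in LFS_PATTERNS)

print(tracked_by_lfs("model.safetensors"))  # True
print(tracked_by_lfs("app.py"))             # False
```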
.gitignore ADDED
@@ -0,0 +1,7 @@
+ __pycache__/
+ *.pyc
+ *.log
+ jules-scratch/
+ force_push*.sh
+ test_*.sh
+ *_git.sh
Dockerfile ADDED
@@ -0,0 +1,23 @@
+ # Use an official Python runtime as a parent image
+ FROM python:3.10-slim
+
+ # Set the working directory in the container
+ WORKDIR /app
+
+ ENV PYTHONUNBUFFERED=1
+ ENV GOOGLE_APPLICATION_CREDENTIALS=/app/key.json
+
+ # Copy the dependencies file to the working directory
+ COPY requirements.txt .
+
+ # Install any needed packages specified in requirements.txt
+ RUN pip install --no-cache-dir -r requirements.txt
+
+ # Copy the rest of the application code to the working directory
+ COPY . .
+
+ # Make port 8080 available to the world outside this container
+ EXPOSE 8080
+
+ # Write the service-account key from the environment, then launch gunicorn
+ CMD printf "%s" "$SERVICE_ACC_KEY" > /app/key.json && gunicorn --bind 0.0.0.0:8080 --timeout 1000 --workers 4 --threads 4 --log-file - app:app
LICENSE ADDED
@@ -0,0 +1,201 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
README.md ADDED
@@ -0,0 +1,48 @@
+ ---
+ title: EHR Navigator Agent With MedGemma
+ emoji: 🩺
+ colorFrom: blue
+ colorTo: yellow
+ sdk: docker
+ pinned: false
+ app_port: 8080
+ license: apache-2.0
+ tags:
+ - medgemma
+ - healthcare
+ - ehr
+ - fhir
+ - agent
+ - google
+ ---
+ > **Disclaimer:** This demonstration is for illustrative purposes only and does not represent a finished or approved product. It is not representative of compliance with any regulations or standards for quality, safety, or efficacy. Any real-world application would require additional development, training, and adaptation. The experience highlighted in this demo shows MedGemma's baseline capability for the displayed task and is intended to help developers and users explore possible applications and inspire further development.
+ This is not an officially supported Google product. This project is not eligible for the Google Open Source Software Vulnerability Rewards Program.
+
+ # EHR Navigator Agent
+
+ In a clinical setting, agents are crucial for navigating and utilizing vast Electronic Health Record (EHR) data, often stored in FHIR format. An agent can efficiently answer specific questions or perform tasks related to a patient by intelligently fetching the most relevant information from their potentially very large and complex record.
+
+ This demo showcases how an agent can use MedGemma's comprehension of the Fast Healthcare Interoperability Resources (FHIR) standard to intelligently navigate a patient's health records. The agent first identifies what information is available, then plans how to retrieve the relevant parts. It fetches data in steps, extracting key facts along the way, and finally combines all these facts to provide a complete answer. This is a simplified example to illustrate the process. All patient data in this demo is synthetic, generated by Synthea (github.com/synthetichealth/synthea). The data is accessible via [this FHIR store](https://console.cloud.google.com/healthcare/fhirviewer/us-central1/public/fhirStores/synthetic-patients/browse?project=hai-cd3-foundations).
+
+ ### Caching
+ This demo is functional, and results are persistently cached to reduce environmental impact.
+
+ ### Resources
+
+ * This demo is also available as a Colab notebook following the same logic:
+ [github.com/Google-Health/medgemma/tree/main/notebooks/ehr_navigator_agent.ipynb](https://github.com/Google-Health/medgemma/tree/main/notebooks/ehr_navigator_agent.ipynb)
+
+ * The EHR FHIR data used in this demo is synthetic, generated by Synthea
+ ([github.com/synthetichealth/synthea](https://github.com/synthetichealth/synthea)).
+ The data is accessible via
+ [this FHIR store](https://console.cloud.google.com/healthcare/fhirviewer/us-central1/public/fhirStores/synthetic-patients/browse?project=hai-cd3-foundations)
+ or as a FHIR API endpoint at
+ `https://healthcare.googleapis.com/v1/projects/hai-cd3-foundations/locations/us-central1/datasets/public/fhirStores/synthetic-patients/fhir`
+
+ * See other HAI-DEF demos here: [HuggingFace Collection](https://huggingface.co/collections/google/hai-def-concept-apps-6837acfccce400abe6ec26c1)
+
+ ### Contacts
+ This demo is part of Google's [Health AI Developer Foundations (HAI-DEF)](https://developers.google.com/health-ai-developer-foundations)
+
+ * **Technical info:** Liron Yatziv [@lirony](https://huggingface.co/lirony)
+ * **Press only:** press@google.com
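The README's identify → plan → fetch → summarize → answer loop can be sketched as follows. This is a hypothetical, heavily simplified stand-in: the manifest contents, resource-type filter, and fact extraction are placeholder logic, not the demo's actual tools in `modules/agent.py`.

```python
# Minimal sketch of the agent loop described above: identify available data,
# select relevant resource types, fetch each in turn, extract facts, answer.
# All names and logic here are illustrative placeholders.

def run_agent_sketch(question: str, manifest: dict) -> str:
    # 1. Identify which resource types in the manifest look relevant (stubbed
    #    with a fixed allowlist; the real demo asks the LLM to choose).
    relevant = [rt for rt in manifest if rt in ("Observation", "MedicationAdministration")]
    facts = []
    # 2. "Fetch" each relevant resource type and extract concise facts (stubbed).
    for resource_type in relevant:
        codes = manifest[resource_type]
        facts.append(f"{resource_type}: {len(codes)} coded entries reviewed")
    # 3. Combine the extracted facts into a final answer.
    return f"Q: {question} | Facts: {'; '.join(facts)}"

manifest = {"Observation": ["lipid-panel", "cbc"], "Encounter": []}
print(run_agent_sketch("Latest lipid panel?", manifest))
```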
app.py ADDED
@@ -0,0 +1,264 @@
+ import json
+ import os
+ import re
+
+ from flask import Flask, Response, jsonify, render_template, request, send_file
+ from langchain_core.messages import HumanMessage, SystemMessage
+ from langchain_google_vertexai import VertexAIModelGarden
+ from modules.agent import create_agent
+
+ app = Flask(__name__)
+
+ # --- Configuration ---
+ llm_endpoint = os.environ.get("LLM_ENDPOINT")
+ match = (
+     re.search(r"projects/([^/]+)/locations/([^/]+)/endpoints/([^/]+)", llm_endpoint)
+     if llm_endpoint
+     else None
+ )
+ if match:
+     YOUR_PROJECT_ID, YOUR_REGION, YOUR_ENDPOINT_ID = match.groups()
+ else:
+     YOUR_PROJECT_ID = os.environ.get("YOUR_PROJECT_ID")
+     YOUR_REGION = os.environ.get("YOUR_REGION", "us-central1")
+     YOUR_ENDPOINT_ID = os.environ.get("YOUR_ENDPOINT_ID", "1030")
+ FHIR_STORE_URL = os.environ.get("FHIR_STORE_URL")
+
+ # --- Hardcoded Questions ---
+ PREDEFINED_QUESTIONS = [
+     {
+         "id": "q1",
+         "question": (
+             "What were the results and dates of the patient's latest "
+             "lipid panel and CBC tests?"
+         ),
+         "patient_id": "c1ae6e14-1833-a8e2-8e26-e0508236994a",
+     },
+     {
+         "id": "q2",
+         "question": (
+             "What specific medications were administered to the patient during"
+             " their sepsis encounter?"
+         ),
+         "patient_id": "e4350e97-bb8c-70b7-9997-9e098cfacef8",
+     },
+ ]
+
+ # --- LLM and Agent Initialization ---
+ try:
+     import google.auth
+     from langchain_community.cache import SQLiteCache
+     from langchain_core.globals import set_llm_cache
+
+     if os.environ.get("SERVICE_ACC_KEY"):
+         credentials, project_id = google.auth.default(
+             scopes=[
+                 "https://www.googleapis.com/auth/cloud-platform",
+                 "https://www.googleapis.com/auth/cloud-healthcare",
+             ]
+         )
+         llm = VertexAIModelGarden(
+             project=YOUR_PROJECT_ID or project_id,
+             location=YOUR_REGION,
+             endpoint_id=YOUR_ENDPOINT_ID,
+             credentials=credentials,
+             allowed_model_args=["temperature", "max_tokens"],
+         )
+     else:
+         from langchain_core.language_models.fake import FakeListLLM
+
+         responses = [
+             '{"name": "get_patient_data_manifest", "args": {"patient_id": "c1ae6e14-1833-a8e2-8e26-e0508236994a"}}',
+             "[]",
+             "Dummy answer",
+         ]
+         llm = FakeListLLM(responses=responses)
+         print("⚠️ Using dummy LLM since SERVICE_ACC_KEY is not provided.")
+
+     set_llm_cache(SQLiteCache(database_path="llm_cache.db"))
+     agent = create_agent(llm, FHIR_STORE_URL)
+     print("✅ LLM and Agent Initialization successful.")
+ except Exception as e:
+     print(f"❌ LLM and Agent Initialization FAILED: {e}")
+     llm = None
+     agent = None
+
+
+ # --- Routes ---
+ @app.route("/")
+ def index():
+     return render_template("index.html")
+
+
+ @app.route("/questions")
+ def get_questions():
+     return jsonify(PREDEFINED_QUESTIONS)
+
+
+ @app.route("/run_agent")
+ def run_agent():
+     if not agent:
+         return jsonify({"error": "Agent not initialized"}), 500
+
+     question_id = request.args.get("question_id")
+
+     selected_question = next(
+         (q for q in PREDEFINED_QUESTIONS if q["id"] == question_id), None
+     )
+
+     if not selected_question:
+         return jsonify({"error": "Invalid question ID"}), 400
+
+     composed_question = (
+         f"{selected_question['question']}. Patient ID"
+         f" {selected_question['patient_id']}."
+     )
+
+     def generate():
+         system_prompt = "SYSTEM INSTRUCTION: think silently if needed."
+         messages = [
+             SystemMessage(content=system_prompt),
+             HumanMessage(content=composed_question),
+         ]
+         inputs = {
+             "messages": messages,
+             "patient_fhir_manifest": {},
+             "tool_output_summary": [],
+         }
+
+         def yield_event(request, destination, event, final=False, data=""):
+             return (
+                 "data:"
+                 f" {json.dumps({'request': request, 'destination': destination, 'event': event, 'final': final, 'data': data})}\n\n"
+             )
+
+         # Each streamed event is JSON of the form:
+         # {
+         #   "request": boolean,
+         #   "destination": string,
+         #   "event": string,
+         #   "final": boolean,
+         #   "data": string
+         # }
+
+         yield yield_event(
+             request=True, destination="LLM", event="Define manifest tool to LLM"
+         )
+         for event in agent.stream(inputs):
+             if "generate_manifest_tool_call" in event:
+                 yield yield_event(
+                     request=False,
+                     destination="LLM",
+                     event="Tool call generated",
+                     data=event["generate_manifest_tool_call"]["tool_call"],
+                 )
+                 yield yield_event(
+                     request=True, destination="FHIR", event="Get patient resources"
+                 )
+             elif "execute_manifest_tool_call" in event:
+                 yield yield_event(
+                     request=False,
+                     destination="FHIR",
+                     event="Patient resources received. Agent creating manifest.",
+                     data=event["execute_manifest_tool_call"]["patient_fhir_manifest"],
+                 )
+             elif "identify_relevant_resource_types" in event:
+                 yield yield_event(
+                     request=True,
+                     destination="LLM",
+                     event="Identify relevant FHIR resources",
+                 )
+                 resources = event["identify_relevant_resource_types"].get(
+                     "relevant_resource_types", []
+                 )
+                 yield yield_event(
+                     request=False,
+                     destination="LLM",
+                     event="Selected FHIR resources to use",
+                     data=resources,
+                 )
+             elif "announce_sdt" in event:
+                 node_output = event["announce_sdt"]
+                 resource_type = node_output.get("resource_type_to_process")
+                 yield yield_event(
+                     request=True,
+                     destination="LLM",
+                     event=f"Select data for {resource_type} resource",
+                     data=node_output.get("resource_manifest_codes"),
+                 )
+             elif "select_data_to_retrieve" in event:
+                 node_output = event["select_data_to_retrieve"]
+                 resource_type = node_output.get("resource_type_processed")
+                 tool_call = node_output.get("tool_calls_to_execute")
+                 if tool_call:
+                     yield yield_event(
+                         request=False,
+                         destination="LLM",
+                         event=f"Tool call: retrieve {resource_type} resource with filter codes",
+                         data=tool_call,
+                     )
+             elif "init_edr_idx" in event:
+                 yield yield_event(
+                     request=True,
+                     destination="FHIR",
+                     event="Retrieve resources from FHIR store",
+                 )
+             elif "execute_data_retrieval" in event:
+                 node_output = event["execute_data_retrieval"]
+                 resource_type = node_output.get("resource_type_retrieved")
+                 yield yield_event(
+                     request=False,
+                     destination="FHIR",
+                     event=f"{resource_type} resource received.",
+                 )
+             elif "announce_summarization" in event:
+                 node_output = event["announce_summarization"]
+                 resource_type = node_output.get("resource_being_summarized")
+                 yield yield_event(
+                     request=True,
+                     destination="LLM",
+                     event=f"Extract concise facts for {resource_type} resource",
+                 )
+             elif "summarize_node" in event:
+                 node_output = event["summarize_node"]
+                 if "tool_output_summary" in node_output:
+                     resource_type = node_output.get("resource_type_retrieved")
+                     yield yield_event(
+                         request=False,
+                         destination="LLM",
+                         event=f"{resource_type} concise facts received.",
+                         data=f'...{node_output.get("tool_output_summary")[0][-200:]}',
+                     )
+             elif "final_answer" in event:
+                 yield yield_event(
+                     request=True, destination="LLM", event="Generate final answer"
+                 )
+                 final_response = event["final_answer"]["messages"][-1].content
+                 final_response = final_response.removesuffix("```").removeprefix(
+                     "```markdown"
+                 )
+                 yield yield_event(
+                     request=False,
+                     destination="LLM",
+                     event="Final Answer",
+                     final=True,
+                     data=final_response,
+                 )
+
+     return Response(generate(), mimetype="text/event-stream")
+
+
+ @app.route("/download")
+ def download_cache():
+     try:
+         return send_file("llm_cache.db", as_attachment=True)
+     except Exception as e:
+         return str(e), 404
+
+
+ @app.route("/download_fhir_cache")
+ def download_fhir_cache():
+     try:
+         return send_file("fhir_cache.db", as_attachment=True)
+     except Exception as e:
+         return str(e), 404
+
+
+ if __name__ == "__main__":
+     app.run(debug=True, port=8080)
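Each message streamed by `/run_agent` is a server-sent-event line whose payload follows the JSON schema commented in `generate`. A small sketch of producing one of these lines and parsing it back on the client side (the field values are illustrative; `yield_event` here mirrors the helper in `app.py`):

```python
import json

def yield_event(request, destination, event, final=False, data=""):
    """Format one server-sent-event line, mirroring app.py's helper."""
    payload = {"request": request, "destination": destination,
               "event": event, "final": final, "data": data}
    return f"data: {json.dumps(payload)}\n\n"

def parse_event(line: str) -> dict:
    """Parse a 'data: {...}' SSE line back into a dict."""
    return json.loads(line.removeprefix("data:").strip())

line = yield_event(request=True, destination="LLM", event="Generate final answer")
event = parse_event(line)
print(event["destination"], event["final"])  # LLM False
```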
fhir_cache.db ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b29e0a8e0de51bfa14721240f73ad4df6829c4c232bbc57e44da9045c03e1ef9
+ size 1200128
llm_cache.db ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c75191e179455f25dadd8a6a8a1426adc964d23cb635928ae94aececcd2337c8
+ size 274432
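The two `.db` entries above are Git LFS pointer files, not the binaries themselves: each is a short text file of `key value` lines (`version`, `oid`, `size`). A quick sketch of reading such a pointer (the `parse_lfs_pointer` helper is hypothetical, not part of this repo):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:c75191e179455f25dadd8a6a8a1426adc964d23cb635928ae94aececcd2337c8
size 274432"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 274432
```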
modules/agent.py ADDED
@@ -0,0 +1,417 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ import json
2
+ import operator
3
+ import re
4
+ from typing import Annotated, TypedDict
5
+
6
+ from langchain_core.messages import AIMessage
7
+ from langchain_core.tools import render_text_description
8
+ from langgraph.graph import END, StateGraph
9
+ from langgraph.prebuilt import ToolNode
10
+ from modules.tools import (
11
+ get_patient_data_manifest,
12
+ get_patient_fhir_resource,
13
+ )
14
+
15
+ _LLM_INVOKE_ARGS = {"max_tokens": 8000, "temperature": 0.6}
16
+
17
+
18
+ def exclude_thinking_component(text: str) -> str:
19
+ """Removes the thinking block (delimited by <unused94> and <unused95>) from a string."""
20
+ return re.sub(r"<unused94>.*?<unused95>", "", text, flags=re.DOTALL).strip()
21
+
22
+
23
+ def strip_json_decoration(text: str) -> str:
24
+ """Removes JSON markdown fences from the start and end of a string."""
25
+ match = re.search(r"```(?:json)?\s*([\{\[].*[\]\}])\s*```", text, re.DOTALL)
26
+ if match:
27
+ return match.group(1)
28
+ return text.strip()
29
+
30
+
31
+ class AgentState(TypedDict):
32
+ messages: Annotated[list, operator.add]
33
+ patient_fhir_manifest: dict
34
+ tool_output_summary: Annotated[list, operator.add]
35
+ tool_calls_to_execute: Annotated[list, operator.add]
36
+ relevant_resource_types: list
37
+ manifest_tool_call_request: AIMessage
38
+ sdt_idx: int
39
+ edr_idx: int
40
+ resource_type_processed: str
41
+ resource_type_retrieved: str
42
+ summary_generated: bool
43
+ resource_type_to_retrieve: str
44
+ resource_type_to_process: str
45
+ fhir_tool_output: str
46
+ resource_being_summarized: str
47
+ tool_call: dict
48
+ resource_manifest_codes: list
49
+
50
+
51
+ def create_agent(llm, fhir_store_url):
+     """Creates and compiles the LangGraph agent."""
+
+     manifest_tool_node = ToolNode([get_patient_data_manifest])
+     data_retrieval_tool_node = ToolNode([get_patient_fhir_resource])
+
+     def generate_manifest_tool_call_node(state):
+         """First step: uses the LLM to find the patient_id from the initial
+         question and generates a tool call for get_patient_data_manifest.
+         """
+         last_message = state["messages"][-1]
+         extraction_prompt = (
+             f"USER QUESTION: {last_message.content}\n\nYou are an API request"
+             " generator. Your task is to identify the patient ID from the user's"
+             " question and output a JSON object to call the"
+             " `get_patient_data_manifest` tool.\n\nYour available tool"
+             f" is:\n{render_text_description([get_patient_data_manifest])}\n\nGenerate"
+             " the correct JSON to call the tool. Respond with only a single, raw"
+             ' JSON object.\n\nEXAMPLE:\n{\n  "name":'
+             ' "get_patient_data_manifest",\n  "args": {\n    "patient_id":'
+             ' "some-patient-id-from-the-question"\n  }\n}\n'
+         )
+         print(
+             "--- generate_manifest_tool_call_node PROMPT"
+             f" ---\n{extraction_prompt}\n-----------------------------"
+         )
+         response_str = llm.invoke(extraction_prompt, **_LLM_INVOKE_ARGS)
+         print(
+             "--- generate_manifest_tool_call_node RESPONSE"
+             f" ---\n{response_str}\n------------------------------"
+         )
+         try:
+             cleaned_response = strip_json_decoration(response_str)
+             tool_call_json = json.loads(cleaned_response)
+             tool_call_json["args"]["fhir_store_url"] = fhir_store_url
+             tool_call_msg = AIMessage(
+                 content="",
+                 tool_calls=[{**tool_call_json, "id": "manifest_call"}],
+             )
+             return {
+                 "manifest_tool_call_request": tool_call_msg,
+                 "tool_call": tool_call_json,
+             }
+         except Exception as e:
+             print(f"Error generating manifest tool call: {e}")
+             raise e
+
+     def execute_manifest_tool_call_node(state):
+         """Executes the get_patient_data_manifest tool call and puts the result in state."""
+         try:
+             tool_call_msg = state["manifest_tool_call_request"]
+             tool_output_message = manifest_tool_node.invoke([tool_call_msg])[0]
+             manifest_dict = json.loads(tool_output_message.content)
+             print(f"Manifest dict: {manifest_dict}")
+             return {"patient_fhir_manifest": manifest_dict}
+         except Exception as e:
+             print(f"Error calling manifest tool: {e}")
+             raise e
+
+     def identify_relevant_resource_types(state):
+         """Uses the manifest and user question to identify relevant FHIR resource types."""
+         print("Identifying Relevant Resource Types")
+         manifest = state.get("patient_fhir_manifest", {})
+         user_question = state["messages"][1].content
+         manifest_content = ""
+         for resource_type, codes in manifest.items():
+             manifest_content += f"**{resource_type}**: "
+             if codes:
+                 manifest_content += f"Available codes include: {', '.join(codes)}\n"
+             else:
+                 manifest_content += "Present (no specific codes found)\n"
+         prompt = (
+             "SYSTEM INSTRUCTION: think silently if needed.\nUSER QUESTION:"
+             f" {user_question}\n\nPATIENT DATA"
+             f" MANIFEST:\n{manifest_content}\n\nYou are a medical assistant"
+             " analyzing a patient's FHIR data manifest to answer a user"
+             " question.\nBased on the user question, identify the specific FHIR"
+             " resource types from the manifest that are most likely to contain the"
+             " information needed to answer the question.\nOutput a JSON list of"
+             " the relevant resource types. Do not include any other text or"
+             ' formatting.\nExample:\n["Condition", "Observation",'
+             ' "MedicationRequest"]\n'
+         )
+         print(
+             "--- identify_relevant_resource_types PROMPT"
+             f" ---\n{prompt}\n------------------------------------------"
+         )
+         response_str = llm.invoke(prompt, **_LLM_INVOKE_ARGS)
+         print(
+             "--- identify_relevant_resource_types RESPONSE"
+             f" ---\n{response_str}\n-------------------------------------------"
+         )
+         try:
+             relevant_resource_types = json.loads(strip_json_decoration(response_str))
+         except json.JSONDecodeError:
+             print(
+                 "Could not decode JSON response for relevant resource types:"
+                 f" {response_str}"
+             )
+             relevant_resource_types = []
+         print(
+             "Relevant Resource Types Identified:"
+             f" {', '.join(relevant_resource_types)}"
+         )
+         return {
+             "relevant_resource_types": relevant_resource_types,
+             "sdt_idx": 0,
+             "tool_calls_to_execute": [],
+         }
+
+     def announce_sdt_node(state):
+         sdt_idx = state["sdt_idx"]
+         relevant_resource_types = state.get("relevant_resource_types", [])
+         resource_type = relevant_resource_types[sdt_idx]
+         manifest = state.get("patient_fhir_manifest", {})
+         resource_manifest = manifest.get(resource_type, [])
+         print(f"Announcing data selection for {resource_type}")
+         return {
+             "resource_type_to_process": resource_type,
+             "resource_manifest_codes": resource_manifest,
+         }
+
+     def select_data_to_retrieve(state):
+         """Uses the manifest and relevant resource types to determine which FHIR resources to retrieve."""
+         sdt_idx = state["sdt_idx"]
+         manifest = state.get("patient_fhir_manifest", {})
+         relevant_resource_types = state.get("relevant_resource_types", [])
+         tools_string = render_text_description([get_patient_fhir_resource])
+
+         resource_type = relevant_resource_types[sdt_idx]
+         print(f"Data Selection for {resource_type}")
+
+         if resource_type not in manifest:
+             print(f"No data found for {resource_type} in the manifest.")
+             return {"sdt_idx": sdt_idx + 1, "resource_type_processed": resource_type}
+
+         manifest_content = f"**{resource_type}**: "
+         if len(manifest.get(resource_type, [])) > 0:
+             manifest_content += (
+                 f"Available codes include: {', '.join(manifest[resource_type])}\n"
+             )
+         else:
+             manifest_content += "Present (no specific codes found)\n"
+         prompt = (
+             "SYSTEM INSTRUCTION: think silently if needed.\n"
+             + "FOR CONTEXT ONLY, USER QUESTION:"
+             f" {state['messages'][1].content}\n\n"
+             + f"PATIENT DATA MANIFEST: {manifest_content}\n\n"
+             + "You are a specialized API request generator. Your SOLE task is to"
+             " output a JSON of a tool call to gather the necessary information"
+             " to answer the user's question. Respond with ONLY a JSON, no"
+             " explanations or prose.\n"
+             + f"Your available tool is:\n{tools_string}\n\n"
+             + f"**At this stage you can only call {resource_type}.**\n"
+             + "EXAMPLE:\n"
+             + '{"name": "get_patient_fhir_resource", "args":'
+             ' {"patient_id": "some-patient-id",'
+             ' "fhir_resource": "'
+             + resource_type
+             + '", "filter_code": "csv-codes-from-manifest"}}'
+         )
+         print(
+             f"--- select_data_to_retrieve PROMPT ({resource_type})"
+             f" ---\n{prompt}\n------------------------------------------"
+         )
+         response_str = llm.invoke(prompt, **_LLM_INVOKE_ARGS)
+         print(
+             f"--- select_data_to_retrieve RESPONSE ({resource_type})"
+             f" ---\n{response_str}\n-------------------------------------------"
+         )
+         try:
+             tool_call = json.loads(strip_json_decoration(response_str))
+             tool_call["args"]["fhir_store_url"] = fhir_store_url
+             return {
+                 "tool_calls_to_execute": [{**tool_call, "id": resource_type}],
+                 "sdt_idx": sdt_idx + 1,
+                 "resource_type_processed": resource_type,
+             }
+         except json.JSONDecodeError:
+             print(
+                 f"Could not decode JSON response for {resource_type}: {response_str}"
+             )
+             # If we fail to decode, we just skip this resource type.
+             return {"sdt_idx": sdt_idx + 1, "resource_type_processed": resource_type}
+
+     def sdt_conditional_edge(state):
+         if state["sdt_idx"] < len(state["relevant_resource_types"]):
+             return "announce_sdt"
+         return "init_edr_idx"
+
+     def init_edr_idx_node(state):
+         return {"edr_idx": 0}
+
+     def init_edr_conditional_edge(state):
+         if state["tool_calls_to_execute"]:
+             return "announce_retrieval"
+         return "final_answer"
+
+     def announce_retrieval_node(state):
+         edr_idx = state["edr_idx"]
+         tool_calls = state.get("tool_calls_to_execute", [])
+         tool_call = tool_calls[edr_idx]
+         resource_type = tool_call.get("id", "unknown_resource")
+         print(f"Announcing retrieval of {resource_type}")
+         return {"resource_type_to_retrieve": resource_type}
+
+     def execute_data_retrieval(state):
+         """Executes the next planned tool call and stores its output in state."""
+         edr_idx = state["edr_idx"]
+         tool_calls = state.get("tool_calls_to_execute", [])
+         tool_call = tool_calls[edr_idx]
+         resource_type = tool_call.get("id", "unknown_resource")
+         print(f"Fetching FHIR data for {resource_type}")
+         tool_output_list = data_retrieval_tool_node.invoke(
+             [AIMessage(content="", tool_calls=[tool_call])]
+         )
+         if not tool_output_list:
+             print(f"No tool output received for {resource_type}")
+             return {
+                 "resource_type_retrieved": resource_type,
+                 "summary_generated": False,
+                 "fhir_tool_output": "",
+             }
+
+         tool_output = tool_output_list[0].content
+         return {
+             "resource_type_retrieved": resource_type,
+             "summary_generated": True,
+             "fhir_tool_output": tool_output,
+         }
+
+     def announce_summarization_node(state):
+         resource_type = state["resource_type_retrieved"]
+         print(f"Announcing summarization of {resource_type}")
+         return {"resource_being_summarized": resource_type}
+
+     def summarize_node(state: AgentState) -> dict:
+         if not state["summary_generated"]:
+             return {"edr_idx": state["edr_idx"] + 1}
+
+         resource_type = state["resource_type_retrieved"]
+         tool_output = state["fhir_tool_output"]
+         concise_facts_prompt = (
+             "SYSTEM INSTRUCTION: think silently if needed.\nFOR CONTEXT ONLY,"
+             f" USER QUESTION: {state['messages'][1].content}\n\nTOOL"
+             f" OUTPUT:\n{tool_output}\n\nYou are a fact summarizing agent."
+             " Your output will be used to answer the USER QUESTION.\nCollect"
+             " facts from the 'TOOL OUTPUT' ONLY if they are relevant to answering"
+             " the USER QUESTION.\nWrite a very concise English summary, only facts"
+             " relevant to the user question. DO NOT OUTPUT JSON.\nYou are not"
+             " authorized to answer the user question. Do not provide any output"
+             " beyond concise facts. Filter out any facts which are not helpful"
+             " for the user question. Include dates or date ranges. Only for the"
+             " most critical facts, include FHIR record references [record"
+             " type/record id]. For facts that repeat multiple times, summarize"
+             " them and provide only a single reference and date range."
+         )
+         print(
+             f"--- summarize_node PROMPT ({resource_type})"
+             f" ---\n{concise_facts_prompt}\n------------------------------------------"
+         )
+         current_summary = llm.invoke(concise_facts_prompt, **_LLM_INVOKE_ARGS)
+         print(
+             f"--- summarize_node RESPONSE ({resource_type})"
+             f" ---\n{current_summary}\n-------------------------------------------"
+         )
+         return {
+             "tool_output_summary": [exclude_thinking_component(current_summary)],
+             "edr_idx": state["edr_idx"] + 1,
+             "resource_type_retrieved": resource_type,
+         }
+
+     def should_summarize_edge(state):
+         if state["summary_generated"]:
+             return "announce_summarization"
+         return "summarize_node"
+
+     def edr_conditional_edge(state):
+         if state["edr_idx"] < len(state["tool_calls_to_execute"]):
+             return "announce_retrieval"
+         return "final_answer"
+
+     def get_final_answer(state):
+         """If we have enough data, this node generates the final answer."""
+         summary = "\n\n".join(state["tool_output_summary"])
+         prompt = (
+             "Synthesize all information from the 'SUMMARIZED INFORMATION' to"
+             " provide a comprehensive final answer. Preserve relevant FHIR"
+             " references.\n\nUSER QUESTION:"
+             f" {state['messages'][1].content}\n\nSUMMARIZED INFORMATION:"
+             f" {summary}\n\nFinal Answer using markdown:"
+         )
+         print(
+             "--- get_final_answer PROMPT"
+             f" ---\n{prompt}\n------------------------------------------"
+         )
+         response = llm.invoke(prompt, **_LLM_INVOKE_ARGS)
+         print(
+             "--- get_final_answer RESPONSE"
+             f" ---\n{response}\n-------------------------------------------"
+         )
+         return {"messages": [AIMessage(content=response)]}
+
+     workflow = StateGraph(AgentState)
+     workflow.add_node(
+         "generate_manifest_tool_call", generate_manifest_tool_call_node
+     )
+     workflow.add_node(
+         "execute_manifest_tool_call", execute_manifest_tool_call_node
+     )
+     workflow.add_node(
+         "identify_relevant_resource_types", identify_relevant_resource_types
+     )
+     workflow.add_node("announce_sdt", announce_sdt_node)
+     workflow.add_node("select_data_to_retrieve", select_data_to_retrieve)
+     workflow.add_node("init_edr_idx", init_edr_idx_node)
+     workflow.add_node("announce_retrieval", announce_retrieval_node)
+     workflow.add_node("execute_data_retrieval", execute_data_retrieval)
+     workflow.add_node("announce_summarization", announce_summarization_node)
+     workflow.add_node("summarize_node", summarize_node)
+     workflow.add_node("final_answer", get_final_answer)
+     workflow.set_entry_point("generate_manifest_tool_call")
+     workflow.add_edge("generate_manifest_tool_call", "execute_manifest_tool_call")
+     workflow.add_edge(
+         "execute_manifest_tool_call", "identify_relevant_resource_types"
+     )
+     workflow.add_edge("identify_relevant_resource_types", "announce_sdt")
+     workflow.add_edge("announce_sdt", "select_data_to_retrieve")
+     workflow.add_conditional_edges(
+         "select_data_to_retrieve",
+         sdt_conditional_edge,
+         {
+             "announce_sdt": "announce_sdt",
+             "init_edr_idx": "init_edr_idx",
+         },
+     )
+     workflow.add_conditional_edges(
+         "init_edr_idx",
+         init_edr_conditional_edge,
+         {
+             "announce_retrieval": "announce_retrieval",
+             "final_answer": "final_answer",
+         },
+     )
+     workflow.add_edge("announce_retrieval", "execute_data_retrieval")
+     workflow.add_conditional_edges(
+         "execute_data_retrieval",
+         should_summarize_edge,
+         {
+             "announce_summarization": "announce_summarization",
+             "summarize_node": "summarize_node",
+         },
+     )
+     workflow.add_edge("announce_summarization", "summarize_node")
+     workflow.add_conditional_edges(
+         "summarize_node",
+         edr_conditional_edge,
+         {
+             "announce_retrieval": "announce_retrieval",
+             "final_answer": "final_answer",
+         },
+     )
+     workflow.add_edge("final_answer", END)
+     return workflow.compile()
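As context for the `Annotated[list, operator.add]` fields in `AgentState`: LangGraph treats such annotations as reducers, so each node's partial return is merged into state by applying the annotated function rather than overwriting the key. Below is a minimal, framework-free illustration of that merge rule; `State` and `merge` are hypothetical names for this sketch, not LangGraph's actual implementation:

```python
import operator
from typing import Annotated, TypedDict, get_type_hints


class State(TypedDict):
    messages: Annotated[list, operator.add]  # reducer: accumulate across updates
    sdt_idx: int                             # no reducer: each update overwrites


def merge(state: dict, update: dict) -> dict:
    """Apply the reducer attached via Annotated, if any; otherwise overwrite."""
    hints = get_type_hints(State, include_extras=True)
    out = dict(state)
    for key, value in update.items():
        meta = getattr(hints[key], "__metadata__", ())
        out[key] = meta[0](state[key], value) if meta else value
    return out


s = {"messages": ["question"], "sdt_idx": 0}
s = merge(s, {"messages": ["summary"], "sdt_idx": 1})
# s is now {"messages": ["question", "summary"], "sdt_idx": 1}
```

This is why nodes like `summarize_node` can return only `{"tool_output_summary": [...], "edr_idx": ...}` and still see earlier summaries preserved.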
modules/tools.py ADDED
@@ -0,0 +1,205 @@
+ import json
+ from typing import Optional
+ from urllib.parse import quote
+ import hashlib
+ import sqlite3
+
+ from google.auth import default as get_auth_default
+ import google.auth.transport.requests as google_auth_requests
+ from langchain_core.tools import tool
+ import requests
+
+ fhir_resource_types = [
+     "Encounter",
+     "Practitioner",
+     "Condition",
+     "Observation",
+     "AllergyIntolerance",
+     "FamilyMemberHistory",
+     "MedicationRequest",
+     "MedicationStatement",
+     "MedicationAdministration",
+     "DiagnosticReport",
+     "Procedure",
+     "ServiceRequest",
+ ]
+
+
+ SCOPES = [
+     "https://www.googleapis.com/auth/cloud-platform",
+     "https://www.googleapis.com/auth/cloud-healthcare",
+ ]
+
+
+ FHIR_CACHE_DB = "fhir_cache.db"
+
+
+ def _init_fhir_cache():
+     conn = sqlite3.connect(FHIR_CACHE_DB)
+     cursor = conn.cursor()
+     cursor.execute("""
+         CREATE TABLE IF NOT EXISTS fhir_cache (
+             key TEXT PRIMARY KEY,
+             value TEXT
+         )
+     """)
+     conn.commit()
+     conn.close()
+
+
+ _init_fhir_cache()
+
+ def _get_fhir_resource(resource_path: str, fhir_store_url: str) -> dict:
+     """Helper function to make an authenticated GET request to the FHIR store, with pagination and compaction."""
+     cache_key = hashlib.md5(f"{resource_path}:{fhir_store_url}".encode()).hexdigest()
+     conn = sqlite3.connect(FHIR_CACHE_DB)
+     cursor = conn.cursor()
+     try:
+         cursor.execute("SELECT value FROM fhir_cache WHERE key = ?", (cache_key,))
+         result = cursor.fetchone()
+         if result:
+             print(f"...[Tool] Cache hit for: {fhir_store_url}/{resource_path}")
+             return json.loads(result[0])
+     except Exception as e:
+         print(f"...[Tool] Cache read error: {str(e)}")
+         # If the cache read fails, proceed without the cache.
+
+     try:
+         credentials, _ = get_auth_default(scopes=SCOPES)
+         request = google_auth_requests.Request()
+         credentials.refresh(request)
+         headers = {"Authorization": f"Bearer {credentials.token}"}
+
+         all_entries = []
+         url = f"{fhir_store_url}/{resource_path}"
+
+         while url:
+             print(f"...[Tool] Making request to: {url}")
+             response = requests.get(url, headers=headers)
+             response.raise_for_status()
+             current_page = response.json()
+
+             if "entry" in current_page:
+                 all_entries.extend(current_page["entry"])
+
+             url = None  # Stop paginating unless a "next" link is found below.
+             for link in current_page.get("link", []):
+                 if link.get("relation") == "next":
+                     url = link.get("url")
+                     break
+
+         # Reconstruct the bundle with all entries
+         data = {
+             "resourceType": "Bundle",
+             "type": "searchset",
+             "total": len(all_entries),
+             "entry": all_entries,
+         }
+
+         def clean(obj):
+             # Remove .resource.meta (timestamps/versions) from all objects
+             if isinstance(obj, list):
+                 return [clean(i) for i in obj]
+             if isinstance(obj, dict):
+                 return {k: clean(v) for k, v in obj.items() if k != "meta"}
+             return (
+                 obj.split("/fhir/")[-1]
+                 if isinstance(obj, str) and "/fhir/" in obj
+                 else obj
+             )
+
+         # [OPTIONAL] Strip technical metadata and shorten URLs
+         for e in all_entries:
+             e.pop("fullUrl", None)
+             e.pop("search", None)
+             if "resource" in e:
+                 e["resource"] = clean(e["resource"])
+
+         try:
+             cursor.execute(
+                 "INSERT INTO fhir_cache (key, value) VALUES (?, ?)",
+                 (cache_key, json.dumps(data)),
+             )
+             conn.commit()
+         except Exception as e:
+             print(f"...[Tool] Cache write error: {str(e)}")
+         return data
+     except Exception as e:
+         print(f"...[Tool] Error: {str(e)}")
+         return {"error": f"An error occurred: {str(e)}"}
+     finally:
+         conn.close()
+
+
+ @tool
+ def get_patient_fhir_resource(
+     patient_id: str,
+     fhir_resource: str,
+     fhir_store_url: str,
+     filter_code: Optional[str] = None,
+ ) -> str:
+     """Gets a list of FHIR resources for a single patient.
+
+     patient_id: The ID of the patient.
+     fhir_resource: The FHIR resource type to retrieve (Observation, Condition,
+         MedicationRequest, etc.)
+     fhir_store_url: The URL of the FHIR store.
+     filter_code: A comma-separated list of code filters to apply to the
+         resource (34117-2, 171207006, 82667-7, 8867-4, etc.)
+     """
+     resource_path = f"{fhir_resource}?patient=Patient/{patient_id}"
+     if filter_code:
+         resource_path += f"&code={quote(filter_code.replace(' ', ''))}"
+     if "Medication" in fhir_resource:
+         resource_path += f"&_include={fhir_resource}:medication"
+
+     content = _get_fhir_resource(resource_path, fhir_store_url)
+
+     # If the initial call with the 'code' filter returns no results, retry
+     # with a 'category' filter instead.
+     if content.get("total", 0) == 0 and filter_code:
+         print(
+             "...[Tool] No results found with 'code' filter. Retrying with"
+             " 'category' filter..."
+         )
+         resource_path = f"{fhir_resource}?patient=Patient/{patient_id}&category={quote(filter_code)}"
+         content = _get_fhir_resource(resource_path, fhir_store_url)
+
+     print(
+         f"...[Tool] Returning {len(content.get('entry', []))} results for"
+         f" {fhir_resource}"
+     )
+     return json.dumps(content)
+
+
+ @tool
+ def get_patient_data_manifest(patient_id: str, fhir_store_url: str) -> str:
+     """Gets a manifest of all available FHIR resources and their codes for a
+     patient by querying the patient's entire record. Use this tool first to
+     discover what data is available.
+     """
+     manifest = {}
+
+     for resource_type in fhir_resource_types:
+         resource_path = f"{resource_type}?patient=Patient/{patient_id}"
+         print(
+             f"...[Tool] Discovering all available {resource_type} resources for"
+             f" patient: {patient_id}"
+         )
+         resources_json = _get_fhir_resource(resource_path, fhir_store_url)
+
+         if isinstance(resources_json, dict) and resources_json.get("total", 0) > 0:
+             for entry in resources_json.get("entry", []):
+                 resource = entry.get("resource", {})
+                 if resource_type not in manifest:
+                     manifest[resource_type] = []
+
+                 if "code" in resource and "coding" in resource["code"]:
+                     for code in resource["code"].get("coding", []):
+                         manifest[resource_type].append(
+                             f'{code.get("display", "")}={code.get("code", "")}'
+                         )
+         else:
+             print(
+                 f"...[Tool] No {resource_type} resources found for patient:"
+                 f" {patient_id}"
+             )
+
+     return json.dumps(manifest)
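As an aside on the caching in `_get_fhir_resource`: each response is keyed by an MD5 of the request path plus store URL and stored as JSON text in sqlite. A minimal, self-contained sketch of that scheme, using an in-memory database in place of `fhir_cache.db` (the path and bundle below are made-up example values):

```python
import hashlib
import json
import sqlite3

# In-memory stand-in for fhir_cache.db; same schema as _init_fhir_cache.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fhir_cache (key TEXT PRIMARY KEY, value TEXT)")


def cache_key(resource_path: str, fhir_store_url: str) -> str:
    # Same key derivation as _get_fhir_resource.
    return hashlib.md5(f"{resource_path}:{fhir_store_url}".encode()).hexdigest()


key = cache_key("Condition?patient=Patient/123", "https://example.invalid/fhir")
conn.execute(
    "INSERT INTO fhir_cache (key, value) VALUES (?, ?)",
    (key, json.dumps({"resourceType": "Bundle", "total": 0})),
)
row = conn.execute("SELECT value FROM fhir_cache WHERE key = ?", (key,)).fetchone()
cached = json.loads(row[0])  # round-trips back to the original dict
```

One consequence of this design is that cached entries never expire; for a demo over a static synthetic dataset that is fine, but a live store would need an invalidation strategy.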
requirements.txt ADDED
@@ -0,0 +1,11 @@
+ flask
+ gunicorn
+ langchain-google-vertexai
+ langchain
+ langchain-core
+ langchain-community
+ google-api-python-client
+ requests
+ google-cloud-aiplatform
+ langgraph
+ pyarrow
run_local.sh ADDED
@@ -0,0 +1,45 @@
+ #!/bin/bash
+ set -e
+
+ # Check if docker-buildx-plugin is installed.
+ if ! dpkg -s docker-buildx-plugin >/dev/null 2>&1; then
+   echo "docker-buildx-plugin is not installed. Installing..."
+   sudo apt-get update && sudo apt-get install -y docker-buildx-plugin
+ fi
+
+ IMAGE_NAME="agentic-medgemma-fhir:latest"
+ CONTAINER_NAME="agentic-medgemma-fhir"
+
+ # Function to stop the container
+ stop() {
+   echo "Stopping container: $CONTAINER_NAME..."
+   # Check if the container is running before trying to stop it
+   if [ "$(docker ps -q -f name=^${CONTAINER_NAME}$)" ]; then
+     docker stop "$CONTAINER_NAME"
+     echo "✅ Container stopped."
+   else
+     echo "Container '$CONTAINER_NAME' is not running."
+   fi
+ }
+
+ echo "Building Docker image: $IMAGE_NAME"
+ DOCKER_BUILDKIT=1 docker build -t "$IMAGE_NAME" .
+
+ stop
+ docker rm "${CONTAINER_NAME}" 2>/dev/null || true
+
+ echo "Killing any process on port 8080..."
+ fuser -k 8080/tcp || true
+
+ ENV_FILE=~/agentic_medgemma_fhir.env
+
+ if [ ! -f "$ENV_FILE" ]; then
+   echo "Error: Environment file not found at $ENV_FILE"
+   exit 1
+ fi
+
+ echo "Running Docker image: $IMAGE_NAME"
+ echo "Access the viewer at http://localhost:8080"
+ docker run --rm --name "${CONTAINER_NAME}" -p 8080:8080 \
+   --env-file "$ENV_FILE" \
+   "$IMAGE_NAME"
static/agent.avif ADDED
static/agent.svg ADDED
static/arrow.js ADDED
@@ -0,0 +1,130 @@
+ const {useEffect, useRef} = React;
+
+ function ArrowFlow({direction = 'rtl', reverseFlow = false}) {
+   const stemPathRef = useRef(null);
+   const circlePathRef = useRef(null);
+   const arrowRef = useRef(null);
+   const circleRefs = useRef([]);
+   circleRefs.current = [];
+
+   const circleRadii = [4, 14, 7, 16, 9];
+
+   const stemPaths = {
+     'rtl': 'M 250,75 L 125,75 A 50 50 0 0 0 75,125 L 75,132',
+     'ltr': 'M 0,75 L 125,75 A 50 50 0 0 1 175,125 L 175,132',
+   };
+   const circlePaths = {
+     'rtl': 'M 300,75 L 125,75 A 50 50 0 0 0 75,125 L 75,182',
+     'ltr': 'M -50,75 L 125,75 A 50 50 0 0 1 175,125 L 175,182',
+   };
+
+   const stemPathD = stemPaths[direction];
+   const circlePathD = circlePaths[direction];
+
+   const addToCircleRefs = el => {
+     if (el && !circleRefs.current.includes(el)) {
+       circleRefs.current.push(el);
+     }
+   };
+
+   useEffect(() => {
+     gsap.registerPlugin(MotionPathPlugin);
+     gsap.set(arrowRef.current, {opacity: 0});
+     gsap.set(circleRefs.current, {opacity: 0});
+
+     const len = stemPathRef.current.getTotalLength();
+     let pathTween, arrowTween, circlesTl;
+
+     if (reverseFlow) {
+       gsap.set(
+           stemPathRef.current, {strokeDasharray: len, strokeDashoffset: -len});
+       pathTween = gsap.to(
+           stemPathRef.current,
+           {strokeDashoffset: 0, duration: 1.6, delay: 0, ease: 'power1.inOut'});
+     } else {
+       gsap.set(
+           stemPathRef.current, {strokeDasharray: len, strokeDashoffset: len});
+       pathTween = gsap.to(
+           stemPathRef.current,
+           {strokeDashoffset: 0, duration: 1.6, delay: 0, ease: 'power1.inOut'});
+     }
+
+     const motionPathConfig = {
+       path: stemPathRef.current,
+       align: stemPathRef.current,
+       alignOrigin: [0.5, 0.5],
+       autoRotate: true,
+       start: reverseFlow ? 1 : 0,
+       end: reverseFlow ? 0 : 1,
+     };
+
+     arrowTween = gsap.to(arrowRef.current, {
+       opacity: 1,
+       duration: 1.6,
+       delay: 0,
+       motionPath: motionPathConfig,
+       ease: 'power1.inOut',
+     });
+
+     const circleMotionPathConfig = {
+       path: circlePathRef.current,
+       align: circlePathRef.current,
+       alignOrigin: [0.5, 0.5],
+       start: reverseFlow ? 1 : 0,
+       end: reverseFlow ? 0 : 1,
+     };
+
+     circlesTl = gsap.timeline({delay: 0});
+     circleRefs.current.forEach((circle, index) => {
+       circlesTl.fromTo(
+           circle, {opacity: 0}, {opacity: 1, duration: 0.2},
+           index * 0.3);  // stagger start time
+       circlesTl.to(
+           circle, {
+             motionPath: circleMotionPathConfig,
+             duration: 1.6,
+             ease: 'power1.inOut',
+           },
+           index * 0.3);  // same start time as opacity fade in
+       circlesTl.to(
+           circle, {opacity: 0, duration: 0.2},
+           index * 0.3 + 1.4);  // fade out before end
+     });
+     circlesTl.to(arrowRef.current, {opacity: 0, duration: 0.2}, 2.8);
+     circlesTl.to(stemPathRef.current, {opacity: 0, duration: 0.2}, 2.8);
+
+     return () => {
+       pathTween.kill();
+       arrowTween.kill();
+       circlesTl.kill();
+     };
+   }, [direction, reverseFlow]);
+
+   return (
+     <div>
+       <svg width='250px' height='150px' viewBox='0 0 250 150' version='1.1'
+           style={{overflow: 'visible'}}>
+         <defs>
+           <filter id='glow' x='-100%' y='-100%' width='300%' height='300%'>
+             <feGaussianBlur in='SourceAlpha' stdDeviation='3' result='blur'/>
+             <feFlood floodColor='#4285F4' floodOpacity='1' result='color'/>
+             <feComposite in='color' in2='blur' operator='in' result='glow'/>
+             <feMerge>
+               <feMergeNode in='glow'/>
+               <feMergeNode in='SourceGraphic'/>
+             </feMerge>
+           </filter>
+         </defs>
+         <g id="Page-1" stroke="none" strokeWidth="1" fill="none" fillRule="evenodd">
+           <path ref={stemPathRef} id="Path-1" className="path" fill="none" stroke="#484135" strokeWidth="8" strokeLinejoin="round" strokeMiterlimit="10" d={stemPathD} />
+           <path ref={circlePathRef} id='Path-Circles' stroke='none' d={circlePathD} />
+           {[...Array(5)].map((_, i) => (
+             <circle key={i} ref={addToCircleRefs} r={circleRadii[i]} fill="#4285F4" filter="url(#glow)" />
+           ))}
+           <polyline ref={arrowRef} id='arrow' points='0,-9 18,0 0,9 5,0' fill='#484135' />
+         </g>
+       </svg>
+     </div>
+   );
+ }
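For reference, the circle animation above uses simple arithmetic staggering: circle `i` fades in at `i * 0.3` s, its 1.6 s motion tween starts at the same moment, and its 0.2 s fade-out starts at `i * 0.3 + 1.4` s, finishing exactly as the motion ends. The schedule can be sanity-checked with a few lines of arithmetic (Python here, matching the repo's backend language; the real timeline is built with GSAP):

```python
# Illustrative check of the ArrowFlow stagger timing: per circle i,
# fade-in at i*0.3s, motion for 1.6s, fade-out starting at i*0.3 + 1.4s.
schedule = [
    {"fade_in": i * 0.3, "motion_end": i * 0.3 + 1.6, "fade_out": i * 0.3 + 1.4}
    for i in range(5)
]

# Each 0.2s fade-out completes exactly when that circle's motion tween ends.
assert all(
    abs((c["fade_out"] + 0.2) - c["motion_end"]) < 1e-9 for c in schedule
)

# The last circle's fade-out starts at 2.6s, just before the arrow and stem
# are themselves faded out at the 2.8s mark in the timeline above.
assert abs(schedule[-1]["fade_out"] - 2.6) < 1e-9
```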
static/background.jpg ADDED

Git LFS Details

  • SHA256: dc39bda40639d1f2a250ea4e93697345272aef558a9addfbae6f7d07c92a248b
  • Pointer size: 131 Bytes
  • Size of remote file: 194 kB
static/clinician.avif ADDED
static/dialog.js ADDED
+ function Dialog({onClose}) {
+   return (
+     <div className="popup-overlay" onClick={onClose}>
+       <div className="popup-content" onClick={e => e.stopPropagation()}>
+         <button className="popup-close-button" onClick={onClose}>×</button>
+         <h2 id="dialog-title" className="dialog-title-text">Details About This Demo</h2>
+         <p>See how this demo works by inspecting the <a href="https://github.com/Google-Health/medgemma/tree/main/notebooks/ehr_navigator_agent.ipynb" target="_blank" rel="noopener noreferrer">colab version <img className="hf-logo" src="https://upload.wikimedia.org/wikipedia/commons/d/d0/Google_Colaboratory_SVG_Logo.svg" /></a> of it. The data used in this demo is synthetic and generated by <a href="https://github.com/synthetichealth/synthea" target="_blank" rel="noopener noreferrer">Synthea</a>. You can access the data via <a href="https://console.cloud.google.com/healthcare/fhirviewer/us-central1/public/fhirStores/synthetic-patients/browse?project=hai-cd3-foundations" target="_blank" rel="noopener noreferrer">this FHIR store</a>.</p>
+         <p><b>The Model:</b> This demo features Google's MedGemma-27B, a Gemma-based model fine-tuned for comprehending medical text. It demonstrates MedGemma's ability to accelerate the development of AI-powered healthcare applications by offering advanced interpretation of medical data.</p>
+         <p><b>Accessing and Using the Model:</b> Google's MedGemma-27B is available on <a href="https://huggingface.co/collections/google/medgemma-release" target="_blank" rel="noopener noreferrer">HuggingFace<img className="hf-logo" src="https://huggingface.co/datasets/huggingface/brand-assets/resolve/main/hf-logo.svg" /></a> and is easily deployable via&nbsp;<a href="https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/medgemma" target="_blank" rel="noopener noreferrer">Model Garden <img className="hf-logo" src="https://www.gstatic.com/cloud/images/icons/apple-icon.png" /></a>. Learn more about using the model and its limitations on the <a href="https://developers.google.com/health-ai-developer-foundations?referral=appoint-ready" target="_blank" rel="noopener noreferrer">HAI-DEF developer site</a>.</p>
+         <p><b>Like this Demo:</b> If you find this demonstration valuable, kindly like it on HuggingFace.</p>
+         <p><b>Explore More Demos:</b> Discover additional demonstrations on HuggingFace Spaces or via Colabs:</p>
+         <ul>
+           <li><a href="https://huggingface.co/collections/google/hai-def-concept-apps-6837acfccce400abe6ec26c1" target="_blank" rel="noopener noreferrer">Collection of concept apps <img className="hf-logo" src="https://huggingface.co/datasets/huggingface/brand-assets/resolve/main/hf-logo.svg" /></a> built around HAI-DEF open models to inspire the community.</li>
+           <li><a href="https://github.com/Google-Health/medgemma/tree/main/notebooks/fine_tune_with_hugging_face.ipynb" target="_blank" rel="noopener noreferrer">Finetune MedGemma Colab <img className="hf-logo" src="https://upload.wikimedia.org/wikipedia/commons/d/d0/Google_Colaboratory_SVG_Logo.svg" /></a> - See an example of how to fine-tune this model.</li>
+         </ul>
+         <p>For more technical details about this demo, please refer to the <a href="https://huggingface.co/spaces/google/ehr-navigator-agent-with-medgemma/blob/main/README.md#table-of-contents" target="_blank" rel="noopener noreferrer">README</a> file in the repository.</p>
+         <div className="popup-footer">
+           <button className="popup-button" onClick={onClose}>Close</button>
+         </div>
+       </div>
+     </div>
+   );
+ }
static/fhir-colors.svg ADDED
static/fhir.svg ADDED
static/intro.svg ADDED
static/medgemma.avif ADDED
static/script.js ADDED
+ const {useState, useEffect, useRef, useCallback} = React;
+
+ function Markdown({ content }) {
+   const md = window.markdownit();
+   const html = md.render(content);
+   return <div dangerouslySetInnerHTML={{ __html: html }} />;
+ }
+
+ function AgentRunner({onBack}) {
+   const [questions, setQuestions] = useState([]);
+   const [selectedQuestionId, setSelectedQuestionId] = useState(null);
+   const [output, setOutput] = useState([]);
+   const [answer, setAnswer] = useState('');
+   const [intermediateData, setIntermediateData] = useState('');
+   const [isRunning, setIsRunning] = useState(false);
+   const [isFhirFlowReversed, setIsFhirFlowReversed] = useState(false);
+   const [isVertexFlowReversed, setIsVertexFlowReversed] = useState(false);
+   const [isFhirFlowActive, setIsFhirFlowActive] = useState(false);
+   const [isVertexFlowActive, setIsVertexFlowActive] = useState(false);
+   const [isVertexWorking, setIsVertexWorking] = useState(false);
+   const [showDetailsDialog, setShowDetailsDialog] = useState(false);
+
+   useEffect(() => {
+     fetch('/questions').then(response => response.json()).then(data => {
+       setQuestions(data);
+     });
+   }, []);
+
+   const eventQueue = useRef([]);
+   const isProcessingQueue = useRef(false);
+   const eventSourceRef = useRef(null);
+   const timeoutRef = useRef(null);
+
+   const eventIdCounter = useRef(0);
+   const addEvent = (eventText) => {
+     const event = {id: eventIdCounter.current++, text: eventText};
+     setOutput(prevOutput => [event, ...prevOutput]);
+   };
+
+   const processMessageQueue = useCallback(() => {
+     if (eventQueue.current.length === 0) {
+       isProcessingQueue.current = false;
+       return;
+     }
+
+     isProcessingQueue.current = true;
+     const event = eventQueue.current.shift();
+     const message = JSON.parse(event.data);
+     const timeout = message.destination === 'LLM' && message.request ? 6000 : 3000;
+
+     if (message.destination === 'FHIR') {
+       setIsFhirFlowReversed(!message.request);
+       setIsFhirFlowActive(true);
+       setIsVertexFlowActive(false);
+       setIsVertexWorking(false);
+     } else if (message.destination === 'LLM') {
+       setIsVertexFlowReversed(!message.request);
+       setIsVertexFlowActive(true);
+       setIsFhirFlowActive(false);
+       setIsVertexWorking(message.request);
+     } else {
+       setIsFhirFlowActive(false);
+       setIsVertexFlowActive(false);
+       setIsVertexWorking(false);
+     }
+
+     if (message.final) {
+       addEvent(message.event);
+       setAnswer(message.data);
+       setIntermediateData('');
+       eventSourceRef.current.close();
+       setIsRunning(false);
+       setIsVertexWorking(false);
+       // If the queue still has items, process them; otherwise stop.
+       if (eventQueue.current.length > 0) {
+         timeoutRef.current = setTimeout(processMessageQueue, timeout);
+       } else {
+         isProcessingQueue.current = false;
+         if (timeoutRef.current) clearTimeout(timeoutRef.current);
+       }
+       return;
+     }
+
+     // Not a final message.
+     if (message.data && typeof message.data !== 'string') {
+       setIntermediateData(
+           '```json\n' + JSON.stringify(message.data, null, 2) + '\n```');
+       addEvent(`${message.event}: Received structured data.`);
+     } else {
+       setIntermediateData(message.data || '');
+       const dataSuffix = message.data && message.data.length < 30 ? `: ${message.data}` : '';
+       addEvent(`${message.event}${dataSuffix}`);
+     }
+
+     timeoutRef.current = setTimeout(processMessageQueue, timeout);
+   }, []);
+
+   const runAgent = (questionId) => {
+     setSelectedQuestionId(questionId);
+     setOutput([]);
+     setAnswer('');
+     setIntermediateData('');
+     setIsRunning(true);
+     setIsFhirFlowActive(false);
+     setIsVertexFlowActive(false);
+     setIsVertexWorking(false);
+     eventQueue.current = [];
+     if (timeoutRef.current) clearTimeout(timeoutRef.current);
+     isProcessingQueue.current = false;
+
+     eventSourceRef.current =
+         new EventSource(`/run_agent?question_id=${questionId}`);
+
+     eventSourceRef.current.onmessage = event => {
+       eventQueue.current.push(event);
+       if (!isProcessingQueue.current) {
+         processMessageQueue();
+       }
+     };
+
+     eventSourceRef.current.onerror = () => {
+       const hasFinalMessage = eventQueue.current.some(event => JSON.parse(event.data).final);
+       if (hasFinalMessage) {
+         eventSourceRef.current.close();
+         if (!isProcessingQueue.current && eventQueue.current.length > 0) {
+           processMessageQueue();
+         }
+         return;
+       }
+
+       addEvent('Error connecting to the agent.');
+       eventSourceRef.current.close();
+       setIsRunning(false);
+       eventQueue.current = [];
+       if (timeoutRef.current) clearTimeout(timeoutRef.current);
+       isProcessingQueue.current = false;
+       setIsVertexWorking(false);
+     };
+   };
+
+   return (
+     <div className='agent-page-container'>
+       {showDetailsDialog && <Dialog onClose={() => setShowDetailsDialog(false)} />}
+       <div className='demo-header'>
+         <button className='back-button' onClick={onBack}>&lt; Back</button>
+         <button className="details-button" onClick={() => setShowDetailsDialog(true)}>
+           <span className="material-symbols-outlined">
+             code
+           </span>
+           Details about this Demo
+         </button>
+       </div>
+       <div className='demo-frame'>
+         <div className='agent-container'>
+           <div className='left-panel'>
+             <h4>Clinician</h4>
+             <img src='static/clinician.avif' alt='Clinician' className='clinician-image' />
+             <h5>Select a Task</h5>
+             <div className="task-list">
+               {questions.map(q => (
+                 <button key={q.id}
+                     className={`task-button ${selectedQuestionId === q.id && isRunning ? 'running' : ''}`}
+                     onClick={() => runAgent(q.id)}
+                     disabled={isRunning}>
+                   {q.question}
+                 </button>
+               ))}
+             </div>
+           </div>
+           <div className='right-panel'>
+             <div className='agent-header'>
+               EHR Navigator Agent
+             </div>
+             <div className="agent-vertex-arrow">
+               {isVertexFlowActive && <ArrowFlow reverseFlow={isVertexFlowReversed} />}
+             </div>
+             <img src='static/agent.svg' alt='Agent' className='agent agent-image' />
+             <div className='agent-fhir-arrow'>
+               {isFhirFlowActive && <ArrowFlow direction='ltr' reverseFlow={isFhirFlowReversed} />}
+             </div>
+             <div className={`vertex ${isVertexWorking ? 'working' : ''}`}>
+               <div className='gcp-resource-frame'>
+                 <img src='static/vertex-ai.svg' alt='Vertex' className='vertex-image' />
+                 <img src='static/medgemma.avif' alt='MedGemma'/>
+                 <div>(Vertex AI)</div>
+               </div>
+             </div>
+             <div className='event-log'>
+               {output.map((event) => (
+                 <div key={event.id} className="event-item">{event.text}</div>
+               ))}
+             </div>
+             <div className='fhir'>
+
+               <div className="gcp-resource-frame">
+                 <img src='static/fhir-colors.svg' alt='FHIR' className='fhir-image' />
+                 <div>Electronic Health Record (EHR)</div>
+                 <div>(FHIR Store)</div>
+               </div>
+             </div>
+             {isRunning && intermediateData && <div className='answer'>
+               <Markdown content={intermediateData} />
+             </div>}
+             {answer && <div className='answer'>
+               {questions.find(q => q.id === selectedQuestionId)?.question}<br/><br/>
+               <strong>Answer: </strong> <Markdown content={answer} />
+             </div>}
+           </div>
+         </div>
+       </div>
+     </div>
+   );
+ }
+
+ function Introduction({ onStart }) {
+   return (
+     <div className="intro-page">
+       <header className="intro-header">
+         <img src='static/medgemma.avif' alt='MedGemma' className='logo' />
+       </header>
+       <main className="intro-content">
+         <section className="diagram-section">
+           <img src='static/intro.svg' alt='Agent Diagram' />
+         </section>
+         <section className="text-section">
+           <h1>EHR Navigator Agent</h1>
+           <p>In a clinical setting, agents are crucial for navigating and utilizing vast Electronic Health Record (EHR) data, often stored in FHIR format. An agent can efficiently answer specific questions or perform tasks related to a patient by intelligently fetching the most relevant information from their potentially very large and complex record.</p>
+           <p>This demo showcases how an agent can use <a href="https://developers.google.com/health-ai-developer-foundations/medgemma" target="_blank" rel="noopener noreferrer">MedGemma’s</a> comprehension of the Fast Healthcare Interoperability Resources (FHIR) standard to intelligently navigate a patient's health records. The agent first identifies what information is available, then plans how to retrieve the relevant parts. It fetches data in steps, extracting key facts along the way, and finally combines all these facts to provide a complete answer. This is a simplified example to illustrate the process. All patient data in this demo is synthetic, generated by Synthea (github.com/synthetichealth/synthea).</p>
+           <div className='disclaimer'>
+             <p><span className='disclaimer-badge'>Disclaimer</span> This demonstration is for illustrative purposes only and does not represent a finished or approved product. It is not representative of compliance to any regulations or standards for quality, safety or efficacy. Any real-world application would require additional development, training, and adaptation. The experience highlighted in this demo shows MedGemma's baseline capability for the displayed task and is intended to help developers and users explore possible applications and inspire further development.</p>
+           </div>
+           <button onClick={onStart} className="view-demo-button">View Demo</button>
+         </section>
+       </main>
+     </div>
+   );
+ }
+
+ function App() {
+   const [showIntro, setShowIntro] = useState(true);
+
+   useEffect(() => {
+     if (showIntro) {
+       document.body.classList.add('intro-active');
+     } else {
+       document.body.classList.remove('intro-active');
+     }
+   }, [showIntro]);
+
+   if (showIntro) {
+     return <Introduction onStart={() => setShowIntro(false)} />;
+   }
+
+   return <AgentRunner onBack={() => setShowIntro(true)} />;
+ }
+
+ const root = ReactDOM.createRoot(document.getElementById('root'));
+ root.render(<App />);
static/style.css ADDED
@@ -0,0 +1,531 @@
+ @import url('https://fonts.googleapis.com/css2?family=Material+Symbols+Outlined');
+ @import url('https://fonts.googleapis.com/css2?family=Google+Sans+Flex:opsz,wdth,wght,ROND@6..144,87.2,1..1000,69&display=swap');
+
+ body {
+   font-family: "Google Sans Flex", sans-serif;
+   margin: 0;
+   background-color: #fff;
+   background-image: url(background.jpg);
+   background-size: cover;
+   background-repeat: no-repeat;
+   background-attachment: fixed;
+ }
+
+ /* Dim overlay for better text readability */
+ body:not(.intro-active)::before {
+   content: "";
+   position: fixed;
+   top: 0;
+   left: 0;
+   right: 0;
+   bottom: 0;
+   background-color: rgba(255, 255, 255, 0.8);
+   z-index: -1;
+ }
+
+ body.intro-active::before {
+   content: "";
+   position: fixed;
+   top: 0;
+   left: 0;
+   right: 0;
+   bottom: 0;
+   background: linear-gradient(to right, rgba(255,255,255,0.4) 0%, rgba(255,255,255,0.95) 50%, rgba(255,255,255,1) 100%);
+   z-index: -1;
+ }
+
+ .container {
+   max-width: 800px;
+   margin: 0 auto;
+ }
+
+ .form-group {
+   margin-bottom: 20px;
+ }
+
+ #question-select {
+   width: 100%;
+   padding: 10px;
+ }
+
+ #run-agent-btn {
+   padding: 10px 20px;
+   cursor: pointer;
+ }
+
+ .result-container {
+   margin-top: 20px;
+   border: 1px solid #ccc;
+   padding: 20px;
+ }
+
+ #output {
+   white-space: pre-wrap;
+ }
+
+ /* Introduction page styles */
+ .intro-page {
+   max-width: 1200px;
+   margin: 0 auto;
+   padding: 20px;
+   position: relative;
+   z-index: 1;
+ }
+
+ .intro-header {
+   display: flex;
+   justify-content: flex-end;
+   padding: 20px 0;
+ }
+
+ .logo {
+   height: 25px;
+ }
+
+ .intro-content {
+   display: flex;
+   gap: 80px;
+   align-items: center;
+   min-height: 70vh;
+   padding: 0 5%;
+ }
+
+ .diagram-section {
+   flex: 1;
+   display: flex;
+   justify-content: center;
+   align-items: center;
+ }
+
+ .diagram-section img {
+   max-width: 100%;
+   height: auto;
+ }
+
+ .text-section {
+   flex: 1;
+ }
+
+ .text-section h1 {
+   margin-top: 0;
+   font-weight: normal;
+   font-size: 2.5em;
+   color: #202124;
+ }
+
+ .text-section ul {
+   list-style: disc;
+   padding-left: 20px;
+   line-height: 1.8;
+   color: #3c4043;
+ }
+
+ .disclaimer {
+   margin: 30px 0;
+   font-size: 0.85em;
+   line-height: 1.6;
+   color: #5f6368;
+ }
+
+ .disclaimer-badge {
+   background-color: #5f6368;
+   color: white;
+   padding: 2px 12px;
+   border-radius: 16px;
+   font-size: 1em;
+   font-weight: 500;
+   white-space: nowrap;
+   margin-right: 6px;
+   display: inline-block;
+   transform: translateY(-1px);
+ }
+
+ .disclaimer p {
+   margin: 0;
+ }
+
+ .view-demo-button {
+   background-color: #202124;
+   color: white;
+   padding: 12px 28px;
+   border: none;
+   border-radius: 25px;
+   cursor: pointer;
+   font-size: 1em;
+ }
+ .view-demo-button:hover {
+   opacity: 0.9;
+ }
+
+ .agent-page-container {
+   max-width: 1200px;
+   min-height: 100vh;
+   margin: 0 auto;
+   display: flex;
+   flex-direction: column;
+ }
+
+ /* Agent page styles */
+ .demo-frame {
+   border-radius: 28px;
+   box-shadow: 0 4px 12px rgba(0, 0, 0, 0.05);
+   overflow: hidden;
+   border: 7px #292929 solid;
+   backdrop-filter: blur(20px);
+   background: linear-gradient(180deg, rgba(255, 255, 255, 0.70) 0%, rgba(255, 255, 255, 0.10) 100%);
+   border-bottom: none;
+   border-bottom-left-radius: 0;
+   border-bottom-right-radius: 0;
+   flex: 1;
+ }
+
+ @media (max-width: 800px) {
+   .agent-container {
+     flex-direction: column;
+     align-items: center;
+   }
+ }
+
+ .demo-header {
+   display: flex;
+   justify-content: space-between;
+   align-items: center;
+   padding: 20px;
+ }
+
+ .back-button, .details-button {
+   background: none;
+   border: none;
+   cursor: pointer;
+   font-size: 0.9em;
+ }
+
+ .back-button {
+   color: #5f6368;
+ }
+
+ .details-button {
+   background-color: #0B57D0;
+   border-radius: 16px;
+   padding: 8px 16px;
+   display: flex;
+   align-items: center;
+   gap: 6px;
+   color: white;
+   font-size: 14px;
+   font-weight: 500;
+   line-height: 20px;
+ }
+
+ .details-button .material-symbols-outlined {
+   font-size: 14px;
+   background-color: #fff;
+   color: #0B57D0;
+   border-radius: 4px;
+   padding: 3px;
+   line-height: 1;
+   font-weight: 700;
+ }
+
+ .agent-container {
+   display: flex;
+   min-height: 700px;
+ }
+
+ /* FHIR arrow */
+ .agent-fhir-arrow .dashed {
+   stroke-dasharray: 5, 12;
+ }
+
+ .left-panel {
+   width: 300px;
+   border-right: 2px white solid;
+   padding: 30px;
+   text-align: center;
+ }
+
+ .left-panel h4 {
+   margin-top: 0;
+   margin-bottom: 20px;
+   font-size: 1.1em;
+   font-weight: 500;
+   color: #202124;
+ }
+ .left-panel h5 {
+   margin-top: 30px;
+   margin-bottom: 15px;
+   font-size: 1em;
+   font-weight: 500;
+   color: #3c4043;
+ }
+
+ .clinician-image {
+   width: 120px;
+   height: 120px;
+   border-radius: 50%;
+   object-fit: cover;
+   margin-bottom: 20px;
+   box-shadow: 0 4px 6px rgba(0, 0, 0, 0.4);
+ }
+
+ .task-list {
+   display: flex;
+   flex-direction: column;
+   gap: 10px;
+ }
+
+ .task-button {
+   background-color: #fff;
+   border: 1px solid #dadce0;
+   border-radius: 20px;
+   padding: 12px 18px;
+   cursor: pointer;
+   text-align: left;
+   font-size: 0.9em;
+   line-height: 1.5;
+   color: #3c4043;
+   transition: background-color 0.2s ease;
+ }
+
+ .task-button:hover {
+   background-color: #f1f3f4;
+ }
+
+ .task-button.running {
+   background-color: #e8f0fe;
+   border-color: #d2e3fc;
+   color: #1967d2;
+ }
+
+ .right-panel {
+   flex-grow: 1;
+   padding: 30px;
+   width: 100%;
+   overflow-y: auto;
+   display: grid;
+   grid-template-columns: 1fr 150px 100px 150px 100px 150px 1fr;
+   grid-template-rows: 20px 150px 250px 1fr;
+   grid-template-areas:
+     " . . . agent-header . . ."
+     ". agent-vertex-arrow agent-vertex-arrow agent agent-fhir-arrow agent-fhir-arrow ."
+     ". vertex event-log event-log event-log fhir ."
+     ". answer answer answer answer answer .";
+   justify-items: stretch;
+   align-items: stretch;
+   justify-content: stretch;
+   align-content: stretch;
+ }
+
+ .right-panel .agent-header {
+   grid-area: agent-header;
+   text-align: center;
+ }
+
+ .right-panel .agent-vertex-arrow {
+   grid-area: agent-vertex-arrow;
+ }
+
+ .right-panel .agent-fhir-arrow {
+   grid-area: agent-fhir-arrow;
+ }
+
+ .right-panel .agent {
+   grid-area: agent;
+ }
+
+ .right-panel .vertex {
+   grid-area: vertex;
+   text-align: center;
+ }
+
+ .vertex.working .gcp-resource-frame {
+   animation: shadow-wobble 2s infinite ease-in-out 2s;
+ }
+
+ .right-panel .gcp-resource-frame {
+   display: flex;
+   align-items: center;
+   gap: 10px;
+   background: radial-gradient(ellipse 72.15% 73.35% at 50.24% 46.53%, rgba(255, 255, 255, 0.70) 0%, rgba(255, 255, 255, 0.20) 100%);
+   border-radius: 28px;
+   border: 2px white solid;
+   backdrop-filter: blur(20px);
+   flex-direction: column;
+   padding-bottom: 10px;
+   margin-top: 10px;
+   padding: 15px;
+ }
+
+ .right-panel .gcp-resource-frame img {
+   max-width: 100%;
+   height: auto;
+ }
+
+ .right-panel .event-log {
+   margin: 15px;
+   grid-area: event-log;
+   overflow-y: hidden;
+   display: flex;
+   flex-direction: column;
+   gap: 5px;
+ }
+
+ .event-item {
+   line-height: 1.6;
+   color: #3c4043;
+   animation: slideIn 0.3s ease-out;
+   opacity: 0;
+   padding: 5px 10px;
+   border-radius: 8px;
+   overflow: hidden;
+   transition: opacity 0.5s ease-out, transform 0.5s ease-out, background-color 0.5s ease;
+   transform-origin: top center;
+   min-height: fit-content;
+   background-color: rgba(200, 200, 200, 0.3);
+   text-align: center;
+ }
+
+ .event-item:nth-child(1) { font-weight: bold; opacity: 1; background-color: skyblue; }
+ .event-item:nth-child(n+2) { opacity: 0.9; transform: scale(0.9); }
+ .event-item:nth-child(n+5) { opacity: 0.1; transform: scale(0.9); }
+
+ .right-panel .answer {
+   grid-area: answer;
+   background-color: #ECFAFF;
+   padding: 15px;
+   border-radius: 8px;
+   height: fit-content;
+   overflow-y: auto;
+   animation: fadeInSlideUp 0.5s ease-out forwards;
+ }
+
+ .right-panel .fhir {
+   grid-area: fhir;
+   text-align: center;
+ }
+
+ .right-panel h4 {
+   margin-top: 0;
+   margin-bottom: 20px;
+   font-size: 1.1em;
+   font-weight: 500;
+   color: #202124;
+ }
+
+ @keyframes slideIn {
+   from {
+     max-height: 0;
+     opacity: 0;
+     padding-top: 0;
+     padding-bottom: 0;
+   }
+   to {
+     max-height: 3em;
+     opacity: 1;
+     padding-top: 5px;
+     padding-bottom: 5px;
+   }
+ }
+
+ @keyframes shadow-wobble {
+   0%, 100% {
+     box-shadow: 0 0 8px 4px rgba(66, 133, 244, 0.5);
+   }
+   50% {
+     box-shadow: 0 0 16px 8px rgba(66, 133, 244, 0.8);
+   }
+ }
+
+ @keyframes fadeInSlideUp {
+   from {
+     opacity: 0;
+     transform: translateY(20px);
+   }
+   to {
+     opacity: 1;
+     transform: translateY(0);
+   }
+ }
+
+ .popup-overlay {
+   position: fixed;
+   top: 0;
+   left: 0;
+   right: 0;
+   bottom: 0;
+   background-color: rgba(0, 0, 0, 0.5);
+   display: flex;
+   justify-content: center;
+   align-items: center;
+   z-index: 1000;
+ }
+
+ .popup-content {
+   background-color: white;
+   padding: 20px 30px;
+   border-radius: 18px;
+   box-shadow: 0 2px 10px rgba(0, 0, 0, 0.1);
+   max-width: 600px;
+   position: relative;
+   color: #3c4043;
+ }
+
+ .popup-content .dialog-title-text {
+   font-weight: 400;
+   padding-right: 20px;
+   margin-bottom: 1.5em;
+ }
+
+ .popup-close-button {
+   position: absolute;
+   top: 10px;
+   right: 10px;
+   background: none;
+   border: none;
+   font-size: 2em;
+   cursor: pointer;
+   color: #5f6368;
+ }
+
+ .popup-button {
+   margin-top: 15px;
+   padding: 10px 24px;
+   border: none;
+   border-radius: 100px;
+   background-color: #0B57D0;
+   color: white;
+   cursor: pointer;
+   font-size: 0.9em;
+ }
+
+ .hf-logo {
+   height: 1.1em;
+   vertical-align: text-bottom;
+   margin-left: 1px;
+ }
+
+ .popup-content p,
+ .popup-content li {
+   line-height: 1.6;
+   margin-bottom: 1em;
+ }
+
+ .popup-content a {
+   color: #0B57D0;
+   text-decoration: none;
+ }
+ .popup-content a:hover {
+   text-decoration: underline;
+ }
+
+ .popup-content ul {
+   margin-top: 0;
+   padding-left: 20px;
+ }
+ .popup-content li {
+   margin-bottom: 0.5em;
+ }
+
+ .popup-footer {
+   display: flex;
+   justify-content: flex-end;
+   margin-top: 20px;
+ }
static/vertex-ai.svg ADDED
templates/index.html ADDED
@@ -0,0 +1,21 @@
+ <!DOCTYPE html>
+ <html lang="en">
+ <head>
+   <meta charset="UTF-8">
+   <meta name="viewport" content="width=device-width, initial-scale=1.0">
+   <title>EHR Navigator Agent Demo</title>
+   <link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
+   <script src="https://unpkg.com/react@18/umd/react.development.js" crossorigin></script>
+   <script src="https://unpkg.com/react-dom@18/umd/react-dom.development.js" crossorigin></script>
+   <script src="https://unpkg.com/@babel/standalone/babel.min.js"></script>
+   <script src="https://cdn.jsdelivr.net/npm/markdown-it@14.0.0/dist/markdown-it.min.js"></script>
+ </head>
+ <body>
+   <div id="root"></div>
+   <script src="https://cdnjs.cloudflare.com/ajax/libs/gsap/3.12.2/gsap.min.js"></script>
+   <script src="https://cdnjs.cloudflare.com/ajax/libs/gsap/3.12.2/MotionPathPlugin.min.js"></script>
+   <script src="{{ url_for('static', filename='arrow.js') }}" type="text/babel"></script>
+   <script src="{{ url_for('static', filename='dialog.js') }}" type="text/babel"></script>
+   <script src="{{ url_for('static', filename='script.js') }}" type="text/babel"></script>
+ </body>
+ </html>
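Note: this commit adds only the frontend; the server endpoints it calls (`/questions` and `/run_agent`) live elsewhere in the repository. For reference, `processMessageQueue` in `static/script.js` JSON-parses each server-sent event and reads the fields `event`, `data`, `destination`, `request`, and `final`. A minimal sketch of how the backend could frame such a message (the helper name `sse_message` is hypothetical, not part of this repo):

```python
import json

def sse_message(event, data=None, destination=None, request=False, final=False):
    """Serialize one agent status update as a server-sent-events frame.

    Field names mirror what the frontend's EventSource handler expects:
    `destination` is "FHIR", "LLM", or None; `request` is True for an
    agent -> service call and False for the reply; `final` is True only
    on the closing message that carries the answer.
    """
    payload = {
        "event": event,
        "data": data,
        "destination": destination,
        "request": request,
        "final": final,
    }
    # SSE frames are "data: <payload>" terminated by a blank line.
    return f"data: {json.dumps(payload)}\n\n"

# Example: the frame emitted while the agent queries the FHIR store.
frame = sse_message("Fetching patient conditions", destination="FHIR", request=True)
```

A streaming Flask view could `yield` such frames with `mimetype="text/event-stream"`; the 6-second vs 3-second display timeout on the client then paces how they are shown.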