include all information provided in the text, do not omit anything
do not make up nonexistent information
'''''' {YOUR_TEXT_HERE}
⭐
Scenario: Convert phrases to narrative
PM: 23:59, a new patient from ER
55F p/w Fever chills 3pm~. no sorethroat, rhinorrhea, oral ulcer, cough, chest pain, abdomen pain, dysuria or diarrhea. No headache or myalgia. TOCC negative . Mild DOE since C/T.
In UCU, 38.5; no shock, room air, neutropenia, high CRP level. CXR nl. Confirm with Dr. Chen, gave cftz, G-CSF, admitted for infection ctrl. pend COVID and Influenza Ag
The Prompt
Goal: convert the information into a medical "narrative" record in paragraph form
date format: YYYY-MM-DD, e.g. 2023-10-09
include all information provided in the text, don't omit anything
do not make up nonexistent information
'''''' {text here}
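A prompt like the one above can be assembled programmatically before being sent to a model. A minimal sketch, assuming the `''''''` marker stands for a triple-quote delimiter around the source text (the function and variable names are illustrative, and nothing here calls an LLM API):

```python
# Sketch: assemble the narrative-conversion prompt from its rule list.
# The rules mirror the prompt above; the delimiter choice is an assumption.

RULES = [
    "Goal: convert the information into a medical narrative record in paragraph form",
    "date format: YYYY-MM-DD, e.g. 2023-10-09",
    "include all information provided in the text, don't omit anything",
    "do not make up nonexistent information",
]

def build_prompt(text: str) -> str:
    """Join the rules, then append the source text inside quote delimiters."""
    return "\n".join(RULES) + '\n"""\n' + text + '\n"""'

print(build_prompt("55F p/w fever and chills since 3 pm ..."))
```

Delimiting the pasted clinical text this way keeps the model from confusing the source note with the instructions themselves.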
⭐
Scenario: Convert narrative to bullet points
A 65-year-old female presented with complaints of lower back pain and stiffness that started in November 2021. She sought medical attention at a clinic, where the doctor diagnosed lumbar spine compression and recommended rehabilitation. However, her symptoms did not improve with the prescribed treatment. Subsequently, she visited the Neurosurgery Department at the Chinese Medical University, where an MRI revealed a fracture in the 11th thoracic vertebra (report not available). On 2021-12-16, she underwent vertebroplasty (bone cement injection) to address the fracture, but her lower back pain and stiffness persisted.
Additionally, during the course of evaluation, a lump was noticed in the right chest area, which has been progressively increasing in size. A biopsy of the right chest lump was performed on 2023-02-18, and the results revealed multiple myeloma (report not available). The patient is now seeking further treatment and care at our hospital.
Prompt
Goal: convert the narrative paragraph into bullet points
order each event chronologically
for each point/event, format as "- YYYY-MM-DD: event"
'''''' {text here}
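The required "- YYYY-MM-DD: event" format is easy to validate and sort mechanically, since ISO dates sort correctly as plain strings. A minimal sketch for checking a model's bullet output (function name and sample events are illustrative):

```python
import re

def sort_events(bullets: list[str]) -> list[str]:
    """Validate '- YYYY-MM-DD: event' lines and return them chronologically.

    ISO 8601 date strings sort correctly with plain string comparison,
    so no datetime parsing is needed.
    """
    pattern = re.compile(r"^- (\d{4}-\d{2}-\d{2}): ")
    for line in bullets:
        if not pattern.match(line):
            raise ValueError(f"not in '- YYYY-MM-DD: event' format: {line!r}")
    return sorted(bullets, key=lambda line: pattern.match(line).group(1))

events = [
    "- 2023-02-18: biopsy of right chest lump revealed multiple myeloma",
    "- 2021-12-16: vertebroplasty (bone cement injection) for T11 fracture",
]
print(sort_events(events))
```

A check like this catches the common failure mode where the model emits the right events but in narrative order rather than chronological order.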
⭐⭐
Scenario: Summarize the Note
This 60 y/o postmenopausal woman, G1P1, has a history of HTN, dyslipidemia, and osteoporosis for years with regular medical control. She has regular health check-ups and has been found to have breast fibrocystic change and proliferative disease at our hospital since 2013. For the left breast lesion, she also received FNA of the left breast lesion three times in the past, with normal findings. On 2023/07/10, breast sono disclosed a BI-RADS 4, left 5-mm irregular hypoechoic lesion in 1/2cm, not demonstrated on the previous sonogram of 2022/01/18.
On 2023/08/07, FNA of the left 1/2 breast lesion showed suspicion for low-grade carcinoma. Core biopsy of the left 1/2 breast lesion confirmed INVASIVE LOBULAR CARCINOMA. Systemic work-up was arranged and reported left breast 1/2 lobular cancer, cT1N0M0, ER: 8, PR: 5, HER2: +/+++. Treatment options were discussed. Under agreement, she was admitted for operation with left breast wide excision plus SLNB.
Prompt
Goal: as an oncologist, summarize the information by following these rules
markdown h2 title for each point
patient info (age, gender, past history)
cancer type (anatomical, histologic)
cancer grade (high, intermediate, or low)
cancer stage: Union for International Cancer Control (AJCC/UICC) tumor-node-metastasis (TNM) classification system
genetic mutations (if provided)
disease course; the first point should always be "Initial presentation"
for each course or event, time information should always be at the beginning of the line, such as "- YYYY-MM-DD: event"
Write clear instructions
transmit the output of process A to process B for further analysis.
don't make up nonexistent information.
truncate the message to fit within the character limit.
present it in a markdown-styled bulleted list.
iterate through all items to ensure each one meets the required criteria.
chronologically
For each…
If … then … else …
Chain-of-Thought Prompting
Xiaoming is 6 years old and his older brother is 9. When Xiaoming is 10, how old will his brother be?
WEI, Jason, et al. Chain-of-thought prompting elicits reasoning in large language models. Advances in Neural Information Processing Systems, 2022, 35: 24824-24837.
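The step-by-step reasoning a chain-of-thought prompt is meant to elicit for this question can be written out as plain arithmetic (variable names are illustrative):

```python
# Chain of thought for the example question, spelled out step by step.
age_xiaoming_then = 6   # Xiaoming's age in the first statement
age_brother_then = 9    # brother's age at the same time

# Step 1: the age gap between brothers never changes.
age_gap = age_brother_then - age_xiaoming_then  # 9 - 6 = 3

# Step 2: apply the fixed gap to the later time point.
age_brother_later = 10 + age_gap  # when Xiaoming is 10

print(age_brother_later)  # 13
```

Without the intermediate "fixed age gap" step, models often answer with a superficially pattern-matched number; prompting for the steps makes the invariant explicit.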
Step-Back Prompting 🙋
Before you start, please ask me any questions you have about this so I can give you more context. Be extremely comprehensive.
ZHENG, Huaixiu Steven, et al. Take a step back: evoking reasoning via abstraction in large language models. arXiv preprint arXiv:2310.06117, 2023.