AI Workflow Engineer
Bethesda, MD
Temporary to Full Time
Experienced
We are looking for an AI Workflow Engineer to design, build, and operationalize AI-enabled workflows that integrate LLMs, retrieval systems, and biomedical data to support scientific and operational use cases for the National Institutes of Health (NIH) scientific databases. The position is available part time or full time, as either a contract role or a staff position, with an initial 6-month duration and the potential for extension. Candidates are not required to be based in the Washington, DC, area but must be willing to travel as needed. Candidates must be authorized to work in the United States.
The AI Workflow Engineer will work closely with stakeholders to prototype and implement high-impact pilot use cases, emphasizing practical usability, scalability, and alignment with real-world scientific workflows. This role is central to enabling structured, traceable, and explainable AI-assisted workflows that enhance productivity, improve data integration, and support reproducible scientific outcomes.
Job Responsibilities
AI Workflow Design & Development
- Design and implement end-to-end AI-enabled workflows supporting biomedical research, data analysis, and operational processes
- Develop agent-based systems that orchestrate multi-step reasoning, tool use, and interaction with structured and unstructured data sources
- Build and optimize RAG pipelines, including vector search, hybrid retrieval, and integration with structured data sources
- Implement tool-use frameworks (e.g., MCP or equivalent architectures) to enable LLM interaction with APIs, databases, and internal systems
- Develop workflows that integrate with NCBI data systems, APIs, and knowledge graphs, enabling structured retrieval and synthesis across resources
Biomedical Data Integration & Domain Adaptation
- Design approaches for entity grounding, normalization, and linking across biomedical concepts (e.g., genes, variants, diseases, publications)
- Apply AI methods to support data harmonization and linkage across NCBI datasets, including structured and semi-structured data
- Prototype and evaluate domain-adapted LLM approaches, including prompt engineering, retrieval optimization, and model re-weighting or fine-tuning where appropriate
- Ensure AI workflows reflect biomedical and bioinformatics contexts, including use of ontologies, identifiers, and domain-specific data structures
Reproducibility, Transparency, & Evaluation
- Develop traceable and reproducible AI workflows, including capture of inputs, prompts, intermediate steps, and outputs
- Design and implement evaluation frameworks to assess performance, reliability, and scientific validity of AI-assisted workflows
- Establish methods to document provenance, assumptions, and transformations in AI-driven outputs
- Develop test harnesses and metrics for quality, robustness, latency, and regression testing
Pilot Implementation & Integration
- Collaborate with stakeholders to identify and prioritize high-impact AI pilot use cases
- Build and iterate on prototypes, supporting real-world deployment and refinement
- Design integration approaches for incorporating AI workflows into existing systems and processes
- Provide technical input on scalability, maintainability, and operationalization
Collaboration & Enablement
- Work closely with training/instruction staff to ensure workflows are usable, understandable, and transferable
- Provide technical guidance and mentorship to staff implementing AI solutions
- Contribute to development of playbooks, templates, and reusable workflow components
Required Skills/Experience
- Bachelor’s or master’s degree in computer science, artificial intelligence, bioinformatics, data science, or related field
- 3–5+ years of experience building AI/ML-enabled applications or workflows in real-world environments
- Experience with:
- LLMs and generative AI
- RAG systems
- Agent-based architectures and multi-step AI workflows
- API development and system integration (e.g., Python, REST)
- Experience working with structured and unstructured data systems
- Ability to design and evaluate end-to-end AI workflows, not just individual components
Desired Skills/Experience
- Experience with tool-use frameworks or MCP-like architectures
- Experience integrating AI with biomedical or bioinformatics data systems (e.g., NCBI resources, genomic data, ontologies)
- Familiarity with entity normalization, knowledge graphs, or semantic data integration
- Experience with vector databases and hybrid retrieval systems
- Experience adapting models to domain-specific contexts (e.g., prompt tuning, retrieval tuning, fine-tuning)
- Experience working in federal, research, or regulated environments
Computercraft offers an excellent benefits package that includes health, dental, vision, disability, and life insurance; a 401(k) plan with matching; paid leave starting at 128 hours/year for the first 3 years of employment; and 11 paid holidays. We also offer the opportunity for a positive work–life balance with a standard 40-hour work week and the chance to work alongside a team of highly accomplished professionals.
To learn about other Computercraft job opportunities, please visit the Careers section of our website: https://www.computercraft-usa.com/.
EEO Employer – Disability/Veteran/Race/Color/Religion/Sex/National Origin/Genetic Information