Technical Specialist – Medicaid Enterprise Data Warehouse
Drive innovation. Build impactful solutions. Support mission-critical healthcare systems.
We're looking for an experienced Technical Specialist to play a key role in designing, developing, and optimizing the Medicaid Enterprise Data Warehouse (EDW). This is your opportunity to work at the intersection of healthcare, big data, and analytics, helping ensure Medicaid partners have the right data at the right time to improve lives.
You'll be joining a collaborative Medicaid ITS team, working closely with Data Governance and Business Intelligence professionals to deliver high-performing data platforms, critical data marts, and advanced data ingestion pipelines into our Big Data environment.
Why You'll Love This Role
- Make a real-world impact – Your work supports healthcare initiatives that directly affect communities.
- Work with cutting-edge tech – Cloudera, Hadoop, PySpark, Hive, Impala, Kafka, StreamSets, and more.
- Collaborative team culture – Partner with highly skilled engineers, analysts, and governance experts.
- Challenging, meaningful projects – From migrations to performance tuning, every day brings variety.
What You'll Do
- Lead the design, development, and maintenance of enterprise-level data warehouse solutions.
- Build efficient ETL/ELT workflows using Hadoop, PySpark, Sqoop, StreamSets, and UNIX scripting (see the illustrative sketch after this list).
- Perform data profiling, quality checks, and reconciliation to ensure accuracy and compliance (PHI/PII).
- Create and optimize Hive and Impala tables for large-scale analytics workloads.
- Monitor and tune performance for long-running jobs, ensuring smooth and timely data delivery.
- Develop reusable frameworks and scripts to improve automation and consistency.
- Collaborate across teams to deploy, document, and support production environments.
- Stay ahead of trends by continuously expanding your technical expertise.
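To give a concrete flavor of the day-to-day work, here is a minimal PySpark sketch of the kind of ingest-validate-publish step described above. The job name, HDFS path, column name (member_id), and target table (edw.claims_daily) are hypothetical placeholders, not details of the actual EDW.

```python
# Minimal sketch of a daily ingestion job (all names hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("claims_daily_load")   # hypothetical job name
    .enableHiveSupport()            # publish results as Hive tables
    .getOrCreate()
)

# Land raw extracts from HDFS (path is illustrative).
raw = spark.read.parquet("hdfs:///data/raw/claims/current")

# Basic quality check: reject rows missing a member identifier.
clean = raw.filter(F.col("member_id").isNotNull())

# Publish a partitioned Hive table suited to Impala analytics workloads.
(clean
 .withColumn("load_date", F.current_date())
 .write
 .mode("overwrite")
 .partitionBy("load_date")
 .saveAsTable("edw.claims_daily"))  # hypothetical target table
```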
What You Bring
- 8 years of experience in Data Warehousing or Data Integration with Big Data/Hadoop.
- Deep hands-on experience with Cloudera Big Data technologies: Hadoop, Hive, Impala, Sqoop, PySpark, StreamSets, Kafka, HDFS, Oozie, etc.
- Strong Oracle SQL and PL/SQL skills and a proven ability to write, tune, and optimize complex queries.
- Experience with ETL/ELT processes, dimensional modeling, and metadata standards.
- Ability to troubleshoot, optimize, and deliver high-quality, production-ready code.
- Knowledge of data security best practices for PHI/PII datasets.
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Bonus Points For
- Leadership or mentoring experience in technical teams.
- Familiarity with Agile and Waterfall methodologies.
- Strong documentation skills and the ability to translate complex data concepts into clear terms.
Nesco Resource offers a comprehensive benefits package for our associates, which includes a MEC (Minimum Essential Coverage) plan that encompasses Medical, Vision, Dental, 401(k), and EAP (Employee Assistance Program) services.
Nesco Resource provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.