Overview
Sr. Data Engineer
What will you do:
- Build data pipelines to assemble large, complex sets of data that meet non-functional and functional business requirements
- Develop ETL solutions using Python, PowerShell, SQL, SSIS, etc. to load and automate complex datasets in PDF, Excel, flat files, JSON, XML, EDI, and other formats
- Take full ownership of end-to-end data processes in Azure cloud environments
- Work closely with data architects, SMEs, and other technology partners to develop and execute the data architecture and product roadmap
- Collaborate independently with backend developers to understand legacy applications and implement their features in the new system
- Troubleshoot issues and other operational bottlenecks to support continuous data delivery for various applications
- Take the initiative to drive changes and improvements, addressing technical debt and tackling new, complex challenges
- Implement complex warehouse views, make database design decisions that support UI needs, and optimize scripts that periodically refresh large-volume datasets
- Perform code reviews and coach team members
- Develop reports on data dictionaries, server metadata, and data files, and implement reporting tools as needed
- Implement best practices for data updates and development; troubleshoot performance and other data-related issues across multiple product applications
- Keep current on big data and data visualization technology trends; evaluate cloud technologies, build proofs of concept, and make recommendations
What you bring:
- 7+ years of data engineering experience working with large data sets and cloud architectures
- Deep experience building data pipelines using ETL tools and paradigms, and loading data to and from RDBMSs such as Postgres, SQL Server, Oracle, or similar
- Proficient in cloud services such as Microsoft Fabric, Azure Data Factory, and Data Lake, or related technologies such as AWS, GCP, or Databricks
- Proficient in using SSDT tools to build SQL Server relational databases, Azure SQL databases, Analysis Services data models, Integration Services packages, and Reporting Services reports
- Solid experience building data solutions with programming languages such as Python, PowerShell, Spark, or Scala
- Advanced T-SQL and ETL automation experience
- Experience working with orchestration tools such as Airflow and building complex dependency workflows
- Self-motivated with the ability to work and learn new technology independently
- Strong problem-solving capabilities, with experience troubleshooting data issues and stabilizing big data systems
- Excellent communication and presentation skills
Bonus points:
- Deep hands-on experience with cloud data migration and with cloud analytics platforms such as Fabric or Databricks
- Certification in one of the cloud platforms (AWS/GCP/Azure)
- Experience with real-time data streaming tools such as Kafka, Kinesis, or similar
- Experience with US healthcare reimbursement terminology and data is a plus