Data Warehouse / Snowflake Consultant Job at Morph Enterprise, Columbus, OH

  • Morph Enterprise
  • Columbus, OH

Job Description

Technical Specialist 4 / TS 4 is a hybrid role (2-3 days onsite per week).

The contract manager may change the hybrid arrangement based on criticality, prioritization, and project deadlines.

The Technical Specialist will be responsible for migrating the current data, framework, and programs from the EDW IOP big data environment to the EDW Snowflake environment. The Technical Specialist will also be involved in Medicaid Enterprise Data Warehouse design, development, implementation, migration, maintenance, and operation activities, and will work closely with the Data Governance and Analytics team. This person will be one of the key technical resources for ingesting data into the EDW Snowflake environment and for building new, or supporting existing, data warehouses and data marts for data analytics and exchange with State and Medicaid partners. This position is a member of Medicaid ITS and works closely with the Business Intelligence & Data Analytics team.

Responsibilities:

  • Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.
  • Provide Snowflake database technical support in developing reliable, efficient, and scalable solutions for various projects on Snowflake.
  • Ingest the existing data, framework, and programs from the EDW IOP big data environment into the EDW Snowflake environment following best practices.
  • Design and develop Snowpark features in Python; understand the requirements and iterate.
  • Interface with the open-source community and contribute to Snowflake's open-source libraries including Snowpark Python and the Snowflake Python Connector.
  • Create, monitor, and maintain role-based access controls, virtual warehouses, Tasks, Snowpipe, and Streams on Snowflake databases to support different use cases.
  • Tune the performance of Snowflake queries and procedures; recommend and document Snowflake best practices.
  • Explore new capabilities of Snowflake, perform POCs, and implement them based on business requirements.
  • Responsible for creating and maintaining the Snowflake technical documentation, ensuring compliance with data governance and security policies.
  • Implement Snowflake user/query log analysis, history capture, and user email alert configuration.
  • Enable data governance in Snowflake, including row/column-level data security using secure views and dynamic data masking features.
  • Perform data analysis, data profiling, data quality and data ingestion in various layers using big data/Hadoop/Hive/Impala queries, PySpark programs and UNIX shell scripts.
  • Follow the organization's coding standards document; create mappings, sessions, and workflows per the mapping specification document.
  • Perform Gap and impact analysis of ETL and IOP jobs for the new requirement and enhancements.
  • Create mock data, perform unit testing, and capture result sets for jobs developed in lower environments.
  • Update the production support runbook and Control-M schedule document for each production release.
  • Create and update design documents, and provide detailed descriptions of workflows after every production release.
  • Continuously monitor production data loads, fix issues, update the tracker document with the issues found, and identify performance issues.
  • Tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.
  • Perform quality assurance checks and reconciliation after data loads, and communicate with vendors to receive corrected data.
  • Participate in ETL/ELT code reviews and design reusable frameworks.
  • Create change request, work plan, test result, and BCAB checklist documents for code deployment to the production environment, and validate the code post-deployment.
  • Work with Snowflake Admin, Hadoop Admin, ETL and SAS admin teams for code deployments and health checks.
  • Create a reusable Audit Balance Control framework to capture reconciliation results, mapping parameters, and variables; it serves as a single point of reference for workflows.
  • Create Snowpark and PySpark programs to ingest historical and incremental data.
  • Create Sqoop scripts to ingest historical data from the EDW Oracle database into Hadoop IOP, and create Hive table and Impala view creation scripts for dimension tables.
  • Participate in meetings to continuously upgrade functional and technical expertise.
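To illustrate the dynamic data masking responsibility above, here is a minimal sketch in Python that builds the corresponding Snowflake DDL statements. The policy, table, and role names are hypothetical placeholders; in practice the statements would be executed through a Snowflake session rather than assembled as strings.

```python
def masking_policy_ddl(policy: str, data_type: str, allowed_roles: list[str],
                       mask: str = "'***MASKED***'") -> str:
    """Build a Snowflake CREATE MASKING POLICY statement that reveals the
    raw value only to the listed roles and masks it for everyone else."""
    roles = ", ".join(f"'{r}'" for r in allowed_roles)
    return (
        f"CREATE MASKING POLICY IF NOT EXISTS {policy} "
        f"AS (val {data_type}) RETURNS {data_type} -> "
        f"CASE WHEN CURRENT_ROLE() IN ({roles}) THEN val ELSE {mask} END"
    )

def apply_policy_ddl(table: str, column: str, policy: str) -> str:
    """Build the ALTER TABLE statement that attaches the policy to a column."""
    return f"ALTER TABLE {table} MODIFY COLUMN {column} SET MASKING POLICY {policy}"

# Hypothetical example: mask member SSNs from everyone except an analyst role.
ddl = masking_policy_ddl("ssn_mask", "STRING", ["MEDICAID_ANALYST"])
alter = apply_policy_ddl("member_dim", "ssn", "ssn_mask")
```

Row-level security would follow the same pattern with secure views or row access policies attached to the relevant tables.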
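Similarly, the Sqoop-based historical ingestion can be sketched as a small helper that assembles the import command. The connection URL, table, and target directory below are placeholders, not actual EDW values.

```python
def sqoop_import_cmd(jdbc_url: str, table: str, target_dir: str,
                     num_mappers: int = 4) -> list[str]:
    """Assemble a Sqoop import command that pulls an Oracle table into HDFS
    as Parquet, split across the given number of mapper tasks."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,          # e.g. jdbc:oracle:thin:@//host:1521/svc
        "--table", table,
        "--target-dir", target_dir,     # HDFS landing directory
        "--num-mappers", str(num_mappers),
        "--as-parquetfile",             # store as Parquet for Hive/Impala use
    ]

# Placeholder values; a real run would pass actual EDW connection details.
cmd = sqoop_import_cmd("jdbc:oracle:thin:@//dbhost:1521/edw", "MEMBER_DIM",
                       "/data/raw/member_dim")
```

The resulting list can be handed to a scheduler or `subprocess.run`; the Hive tables and Impala views over the landed Parquet files would then be created by separate DDL scripts.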

Required Skill Sets:

  • Proficiency in Data Warehousing, Data migration, and Snowflake is essential for this role.
  • Strong experience in the implementation, execution, and maintenance of Data Integration technology solutions.
  • Minimum 4-6 years of hands-on experience with cloud databases.
  • Minimum 2-3 years of hands-on data migration experience from a big data environment to a Snowflake environment.

Skill | Required / Desired | Amount of Experience
----- | ------------------ | --------------------
Proficiency in Data Warehousing, data migration, and Snowflake | Required |
Implementation, execution, and maintenance of Data Integration technology solutions | Required |
Hands-on experience with cloud databases | Required | 4-6 years
Hands-on data migration from a big data environment to Snowflake | Required | 2-3 years

Job Tags

Contract work, Remote job, 2 days per week, 3 days per week
