Data Engineer - JP00002596, JP00002595
--VStream Labs Recruitment--
This role is part of our Information Technology Enterprise Data Services Group. You will be responsible for leading architecture, design, analysis, and implementation within a successful and experienced team. You'll apply your depth of knowledge and expertise with both modern and legacy data platforms to develop data ecosystems that meet business requirements and align with enterprise architecture goals and standards. We have embarked on an exciting journey to design and build a next-generation data platform that supports the growing data needs of the business and enables AI capabilities to drive business value.
We embrace a culture that challenges the status quo and continually looks for ways to simplify processes, technology, and workflows.
What you’ll do:
- Design, build, and operationalize large-scale enterprise data solutions in Hadoop, Postgres, Oracle, and Snowflake.
- Design and develop ETL pipelines to ingest data into Oracle/Postgres from different data sources (Files, Mainframe, Relational Sources, NoSQL, Hadoop, etc.) using Informatica BDM.
- Craft solution designs for the acquisition and ingestion of complex internal and external data sets, data integrations, and data warehouses/marts.
- Collaborate with business partners, product owners, functional specialists, business analysts, IT architects, and developers to develop solution designs that adhere to architecture standards.
- Ensure solutions adhere to enterprise data governance and design standards.
- Act as a point of contact to resolve architectural, technical, and solution-related challenges for delivery teams to ensure efficiency.
- Advocate for the importance of data catalogs, data governance, and data quality practices.
- Apply outstanding problem-solving skills.
- Work in an Agile delivery framework to evolve data models and solution designs to deliver value incrementally.
- Be a self-starter with experience working in a fast-paced agile development environment.
- Demonstrate strong mentoring and coaching skills, leading by example for junior team members.
- Be outcome-focused, with strong decision-making and critical-thinking skills; challenge the status quo to improve delivery pace, performance, and efficiency.
What you’ll bring:
- University degree in Computer Engineering or Computer Science.
- 7+ years of experience designing solutions for data lakes, data integrations, and data warehouses/marts.
- Proficient in Informatica IDQ to design, implement, and optimize data quality solutions for enterprise data management initiatives.
- Experienced Managed File Transfer (MFT) Specialist with expertise in MQFTE to ensure secure and efficient data exchange.
- Experience developing and maintaining automated scripts using shell scripting and Zena to streamline and optimize system operations and workflows.
- Solid grasp/experience with data technologies & tools (Hadoop, Oracle, PostgreSQL, Informatica, etc.).
- Outstanding knowledge and experience in ETL with the Informatica product suite.
- Experience implementing data governance principles and practices.
- Familiarity with Agile software development.
- Excellent verbal and written communication skills.
- Insurance knowledge is an asset, along with the ability to understand the complex business processes that drive technical systems.