<p>MidAmerican Energy Company has an exciting career opportunity available. Take the next step in your career and apply now!</p> <p>Bachelor's degree in information systems, computer science or a related technical field; or equivalent work experience. (Typically, four years of related, progressive work experience would be needed for candidates applying for this position who do not possess a bachelor's degree.)</p> <p>Six or more years of experience with advanced knowledge of data architecture, cloud platforms (especially Azure), and enterprise data solutions is required for the senior level.</p> <p>Proficiency in data engineering tools and platforms, especially Azure Data Factory, Azure Databricks, Informatica PowerCenter and IICS, and Oracle Data Integrator.</p> <p>Proficiency in Oracle Database, IBM DB2, and Azure.</p> <p>Strong understanding of data modeling, ETL/ELT processes, and performance tuning of enterprise-level applications.</p> <p>Expert-level knowledge of data-related technologies from architecture to administration, including design, development, optimization, and licensing.</p> <p>Proven experience working in the utility industry is required.</p> <p>Effective oral and written communication skills, with the ability to collaborate across teams and mentor junior engineers.</p> <p>Strong analytical and problem-solving abilities.</p> <p>Ability to prioritize and manage multiple tasks and projects concurrently.</p> <ul> <li> <p>Design and implement scalable data ingestion and transformation frameworks using one or more of the following:</p> </li><li> <p>Azure services enabling structured, semi-structured, and unstructured data to be efficiently processed and integrated into enterprise data platforms</p> </li><li> <p>Informatica PowerCenter and Informatica Cloud</p> </li><li> <p>Oracle Data Integrator</p> </li><li> <p>Build and maintain robust ETL/ELT pipelines.</p> </li><li> <p>Integrate data from diverse sources, including on-premises systems, cloud storage, APIs, and streaming platforms.</p> </li></ul> <p>Informatica Development and Optimization</p> <ul> <li>Design, develop, test, and maintain ETL pipelines using Informatica PowerCenter, including performance tuning, error handling, and integration with Control-M scheduling. </li><li>Participate in the migration from PowerCenter to Informatica Cloud (IICS) by redesigning mappings, optimizing transformations, and supporting secure agent configurations. </li></ul> <p>Oracle Data Integrator</p> <ul> <li>Design, develop, test, and maintain ETL pipelines using Oracle Data Integrator, including performance tuning, error handling, and integration with Control-M scheduling. </li><li>Experience with the Fusion AI Data Platform is a plus (Fusion Data Intelligence, Fusion Analytics Warehouse). </li></ul> <p>Databricks Development and Optimization</p> <ul> <li>Develop and optimize notebooks and workflows in Azure Databricks using PySpark and SQL. </li><li>Implement Delta Lake for efficient data storage, versioning, and ACID transactions. </li><li>Leverage Databricks features such as Unity Catalog and job orchestration. </li></ul> <p>Data Modeling and Architecture</p> <ul> <li>Design and implement data models (star/snowflake schemas) for analytics and reporting. </li><li>Collaborate with architects to define data lakehouse architecture and best practices. </li><li>Hands-on experience implementing and optimizing data solutions using the Medallion Architecture (Bronze, Silver, Gold layers) for scalable and structured data processing. </li></ul> <p>Data Quality and Governance</p> <ul> <li>Implement data validation, profiling, and cleansing routines. </li><li>Ensure compliance with data governance policies, including data lineage and metadata management. </li></ul> <p>Performance Tuning and Monitoring</p> <ul> <li>Monitor and optimize the performance of various data processes. </li><li>Troubleshoot and resolve issues related to data latency, job failures, and resource utilization. 
</li></ul> <p>Collaboration and Stakeholder Engagement</p> <ul> <li>Work closely with data scientists, analysts, and business units to understand data requirements. </li><li>Translate business needs into technical solutions that are scalable and maintainable. </li></ul> <p>Security and Compliance</p> <ul> <li>Implement role-based access control (RBAC), encryption, and secure data handling practices. </li><li>Ensure compliance with industry regulations (e.g., NERC CIP, GDPR, and HIPAA where applicable). </li></ul> <p>Documentation and Best Practices</p> <ul> <li>Maintain clear documentation of data flows, architecture, and operational procedures. </li><li>Promote best practices in code versioning, testing, and CI/CD for data engineering. </li></ul>
POST A JOB
It's completely FREE to post your jobs on ZiNG! There's no catch, no credit card needed, and no limit on the number of job posts.
The first step is to SIGN UP so that you can manage all your job postings under your profile.
If you already have an account, you can LOGIN to post a job or manage your other postings.
Thank you for helping us get Americans back to work!