ETL Developer (Informatica) in Kansas City, MO at GDH

Date Posted: 2/15/2018

Job Description

The ETL Data Integration Specialist will be responsible for working closely with the Data Architects, Data Analysts, and Power BI development teams to produce ETL solutions in support of data warehousing and data management that follow industry best practices and company standards. The ETL Specialist will have strong programming and analytical skills, with the ability to review, analyze, and understand business requirements as well as internal database source and target-state architecture. The role will cover all systems development life cycle (SDLC) phases, including requirements analysis, system design, development, coding, testing, debugging, implementation, and post-implementation support.

Knowledge & Skills:
- Collaborate with business/data analysts, data architects, developers, and database administrators to create conceptual, logical, and physical data models for Business Intelligence reporting databases.
- Review business requirements to identify ETL requirements, and use Informatica/SSIS to develop ETL processes that support the creation of databases, table spaces, tables, indexes, triggers, procedures, and other database objects.
- Support the design, development, and implementation of data architecture for historical reporting and analysis.
- Define the content and structure of the database (schema) and advise users on efficient techniques for extracting data.
- Design Informatica-based customized ETL solutions to support the transition and optimization of data transfers from internal source databases to an Azure cloud data warehouse, based on specific business requirements.
- Develop and apply advanced research techniques in the investigation and resolution of highly complex compatibility and interface design considerations.
- Advise the development team on design factors that impact storage capacity, processing speed, and input/output requirements.
- Apply creative problem-solving skills, with the ability to manage multiple projects.
- Factor emerging technologies and product supportability into design and implementation.
- Implement changes and provide post-implementation user support and system support.
- Demonstrate solid organizational and time management skills, with the ability to perform detailed tasks with a high degree of accuracy and to work under pressure.
- Document project-specific processes and data flows, data dependencies, data definitions, and relevant business rules.
- Establish data dictionaries to support the buildout of a data governance infrastructure.
- Leverage industry best practices in establishing repeatable BI practices, principles, and processes.

Requirements:
- Bachelor's degree in Engineering, Computer Science, Information Systems, or a related field with programming coursework, or equivalent database development experience.
- 8+ years of experience in the IT industry spanning analysis, design, development, integration, issue resolution, maintenance, production implementation, and testing.
- 5+ years of experience in an Informatica ETL developer role, with exposure to interfacing with DB2, Salesforce, SAP, SQL Server, and flat files.
- Ability to develop advanced ETL transformations supporting the creation of Business Intelligence data warehouses, on premise or in the cloud.
- Advanced knowledge of the following: Informatica PowerCenter 9.x, Informatica Data Quality 9.x, Informatica Developer Client, Informatica Analyst tool 9.x, and Informatica PowerExchange.
- Experience working with SAS analytics software and solutions highly preferable.
- Expertise in analyzing business requirements, functional requirements, and data specifications.
- Ability to identify transformation rules, develop source-to-target mappings, and apply SCD1/SCD2 logic to maintain current and historical data loads.
- Experience designing and developing Informatica mappings using transformations such as Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, and Update Strategy.
- Strong data modeling experience with ODS and dimensional data modeling methodologies such as star schema and snowflake schema.
- Experience with data warehouse tasks supporting requirements management, architecture design, analysis, mapping, modeling, integrity checking, impact analysis, and regression testing.
- Excellent problem-solving skills, with a demonstrated ability to identify and solve problems.
- Strong oral and written communication skills.

Preferable:
- Experience working within the railroad industry and familiarity with KPIs supporting business and/or operations.
- Experience working with Informatica Cloud and Azure Cloud.
- Experience working within the Azure cloud environment: Azure Data Lake, Azure DW, PaaS.
- Experience working with SAS.