Resume for David H. for Consultant / Computer Software in Denver, Colorado

Occupation: Consultant Industry: Computer Software
Country: United States City: Denver
State: Colorado ZIP: 80129


Experience Summary 

+ years of software architecture and development experience including:  

  • Data warehouse architecture, in both Kimball and Inmon styles.
  • Data modeling using a variety of popular ERD tools; data modeling for database-appliance-type databases, e.g. Teradata.
  • Database and table design and implementation for data warehouse and related ETL processes.
  • ETL architecture.
  • Full software life cycle development across numerous software and hardware platforms in a variety of industries.
  • Analysis, design, development, and implementation of data warehouse, ETL, client/server, and mainframe transactional applications.
  • + years of Informatica experience, including Version ..
  • Structured and iterative data warehouse and application development lifecycle methodologies.
  • Production and development database administration for Oracle g, Sybase, Informix, and DB UDB.
  • Internal infrastructure improvement and knowledge management analysis.

    Technical Experience 

    Tools: Informatica PowerCenter versions – , , Talend, Power Designer, Brio Query and Enterprise Server, ERwin, Microsoft Office Toolset, Visio, Oracle Designer

    Operating Systems: UNIX (Sun Solaris), AIX, HP-UX, Linux, Windows NT/
    Databases: Oracle , , i, i , g database/associated tools/Oracle Enterprise Manager, Teradata, DB UDB EE/EEE, Sybase ASE, Sybase IQ, Microsoft SQL Server, Microsoft Access, flat file systems, MySQL
    Languages: SQL, PL/SQL, T-SQL, UNIX shell scripts, COBOL, Perl, PHP, JavaScript, C, C++
    Industries: Finance, Telecommunications, Cable, Government, Manufacturing, Banking



    Sports Authority Present  

    Produced ETL architecture recommendations for loading data into Teradata and Oracle databases using Informatica’s pushdown optimization. Used Informatica's Teradata Parallel Transporter (TPT) Writer to populate the Teradata warehouse. Mentored junior developers on Teradata SQL and Informatica.
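    Pushdown optimization works by rendering mapping logic as SQL that runs inside the target database rather than row by row in the ETL engine. A minimal sketch of the idea in Python (hypothetical table and column names; real Informatica generates the SQL from mapping metadata):

```python
def pushdown_sql(source_table, filters, aggregates, group_by):
    """Render a simple filter + aggregate mapping as a single SQL
    statement, so the database (e.g. Teradata) does the heavy lifting
    instead of the ETL engine -- the essence of pushdown optimization.
    `aggregates` is a list of (function, column, alias) tuples."""
    select_list = ", ".join(
        group_by + [f"{fn}({col}) AS {alias}" for fn, col, alias in aggregates]
    )
    sql = f"SELECT {select_list} FROM {source_table}"
    if filters:
        sql += " WHERE " + " AND ".join(filters)
    if group_by:
        sql += " GROUP BY " + ", ".join(group_by)
    return sql
```

    For example, `pushdown_sql("sales", ["region = 'WEST'"], [("SUM", "amount", "total")], ["store_id"])` yields one aggregate query that the database can parallelize across its own hardware.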

    Electrolux  Aug – Apr  

    As an Informatica/SAP consultant, designed and created mappings/sessions/workflows to facilitate a data conversion from JD Edwards financial software to SAP. Created a reproducible architecture, used in subsequent data migrations, based on Informatica’s PowerConnect for SAP. Moved data using SAP components for Informatica. Used IDoc, LSMW, and SQL Server technologies.

    Datasource Consulting / Transfirst Jan – Jun  

    As an ETL consultant, designed and created + mappings/sessions/workflows to move data from source to a reporting data warehouse for a complex Cognos Business Intelligence project in the credit card processing industry. Used Informatica in an MS Windows / SQL Server environment. Created standards and best practices for Talend ETL components and jobs. Implemented custom Change Data Capture solutions.
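    A custom change-data-capture solution of this kind is commonly built by diffing the current extract against the previous snapshot on a business key. A sketch of that pattern (hypothetical key and row shapes; the actual implementation used Informatica/Talend components):

```python
def capture_changes(previous, current, key="account_id"):
    """Classify rows as inserts, updates, or deletes by comparing the
    prior snapshot of a source table with the current extract.
    Rows are dicts keyed on a business key (`account_id` is a
    hypothetical example)."""
    prev = {row[key]: row for row in previous}
    curr = {row[key]: row for row in current}
    inserts = [curr[k] for k in sorted(curr.keys() - prev.keys())]
    deletes = [prev[k] for k in sorted(prev.keys() - curr.keys())]
    updates = [curr[k] for k in sorted(curr.keys() & prev.keys())
               if curr[k] != prev[k]]
    return inserts, updates, deletes
```

    The three result sets then drive the warehouse load: inserts become new rows, updates become type-1 or type-2 dimension changes, and deletes become logical deletions.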

    Echostar July – April  

    As an Informatica consultant, streamlined and tuned accounting-related ETL for efficiency using Teradata, Oracle, and SQL Server databases.

    Created architectural changes to Teradata loading to work with new features of Informatica. Changed mapping designs to take advantage of pushdown optimization. Made performance-tuning changes to better utilize Teradata’s parallel hardware architecture.

    Improved data load performance in Teradata by redesigning ETL to use the native database appliance loaders (mload, fastload, parallel data transporter). Designed alternative approaches to handle Teradata's table-level locking architecture. Helped balance data over Teradata AMPs in order to improve performance.
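    Teradata distributes rows across AMPs by hashing the primary index, so a skewed or low-cardinality index piles rows onto a few AMPs. A rough sketch of how that skew can be measured (illustrative only; Teradata's actual row-hash function differs):

```python
import hashlib
from collections import Counter

def amp_distribution(index_values, num_amps=8):
    """Assign each primary-index value to an AMP via a hash (a stand-in
    for Teradata's row hash) and count the rows landing on each AMP."""
    counts = Counter()
    for value in index_values:
        digest = hashlib.md5(str(value).encode()).hexdigest()
        counts[int(digest, 16) % num_amps] += 1
    return counts

def skew_factor(counts, num_amps=8):
    """Busiest AMP's row count over the average load: 1.0 means
    perfectly even, num_amps means everything landed on one AMP."""
    average = sum(counts.values()) / num_amps
    return max(counts.values()) / average
```

    A constant primary index sends every row to a single AMP (skew equal to the number of AMPs), while a near-unique index spreads rows almost evenly; rebalancing amounts to choosing an index whose hash distributes rows uniformly.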

    Processing time for the accounting close period was reduced from hours to hours. Assisted others in maximizing the use of Informatica in ETL processes.

    Jeppesen Jan – April  

    As an Informatica consultant, designed and implemented performance-improvement changes to nightly ETL processes using Informatica with Sybase and Oracle databases.

    Newmont Mining June  

    As a Consultant, modified mappings, data models, and an application to use new data sources. Backend databases changed from Oracle to SQL Server. The application used a service-oriented architecture to retrieve data.

    Canadian Imperial Bank of Commerce (CIBC)  Mar – April  

    As an Application Architect, reviewed database design, data flow, ETL, and application performance in order to recommend the hardware size needed for implementation of a Basel II risk management application. The data warehouse / Basel application was implemented using Oracle g RAC, Informatica .., SAS, and various reporting tools.

  • Created hardware sizing test plan for benchmarking application timings in Sun Microsystem’s hardware lab.
  • Collaborated with diverse development groups to create benchmark timings on internal hardware.
  • Recommended database infrastructure changes to improve performance.
  • Recommended data modeling and ETL changes to improve maintainability, data quality, best practices, and performance.

    Time Warner Cable  Feb – Feb  

    As a Data Warehouse Architect and Informatica/ETL consultant, architected and developed ETL software modifications for a data warehouse and associated data marts in Informatica Version .

  • Data modeling using Erwin
  • Source data analysis, table design using ERwin, improved cube designs for ease of loading.
  • Changed architecture and mappings as needed to improve performance.
  • Business analysis and data model design for new tables related to ETL.
  • Design and create database tables needed for application.

    Johns Manville  June – Dec  

    As an Informatica/ETL consultant, designed, modeled, and developed ETL software for a data mart in Informatica. Data was sourced from Oracle databases, utilized by Brio reports, and fed into the Elevon mainframe financial application.

  • Rearchitected data flow for efficiency.
  • Created / revised data models
  • Created and maintained mappings as needed.
  • Tested application and implemented into production.
  • Data Warehouse and Data Mart design and data modeling.
  • Business analysis and data model design for new tables related to ETL.

    Janus Funds  Feb – Sep  

    Led a company-wide Informatica upgrade from version to version.

  • Created upgrade plan to migrate all internal applications populating data marts and data warehouse to new version.
  • Implemented upgrade plan including testing of all production mappings, sessions, workflows.
  • Modified mappings, sessions, workflows as needed to get them to work as designed in upgraded environment.

    Informatica / Core Integration Partners / FBI Feb – Mar  

    Informatica internals development. Created an advanced external procedure module allowing Informatica mappings on Sun platforms to execute Perl scripts. The functionality was needed for an FBI project. This module is now available as part of versions & of Informatica.

  • Created advanced external procedures to test Perl functionality.
  • Created a data code interface to Perl using the Perl & Informatica APIs.
  • Created development environment
  • Tested external procedure.

    Xcel Energy  June – Jan  

    As a Data Warehouse consultant, developed ETL software for a new data warehouse and associated data marts in Informatica Versions ..

  • Created and maintained mappings as needed.
  • Data Warehouse and Data Mart design and data modeling.
  • Business analysis and data model design for new tables related to ETL.

    Morgan Stanley  December – May  

    As an Informatica/ETL consultant, developed ETL software for enhancements to operational data marts in Informatica Versions . & .:

  • Created and maintained mappings as needed.
  • Business analysis and data model design for new tables related to ETL.

    Xcel Energy Markets  November  

    As an Informatica/ETL consultant, led an upgrade of ETL software to Informatica Version :

  • Data modeling using Oracle Designer
  • Converted ETL from Informatica PowerCenter/PowerMart Version . to Version .
  • Trained development team in Version upgrade process and new features.
  • Tested converted ETL to verify existing functionality was retained or improved.

    Morgan Stanley Dean Witter Online  April – December  

    As an Informatica/ETL consultant, played a pivotal role leading an upgrade of ETL software to Informatica Version :

  • Converted ETL from Informatica PowerCenter/PowerMart Version . to Version . Changed existing mappings to take advantage of new features that improve performance.
  • Trained development team in Version upgrade process and new features.
  • Modified company best practices and procedures to accommodate differences in Version .
  • Led conversion of other projects to Version .

    On an ongoing data warehouse project providing financial data to Brio reports, responsibilities included:

  • Data modeling using Erwin
  • Technical mentoring of the ETL team
  • Architecting and design of data warehouse ETL processes.
  • Conversion and migration of ETL functionality from Sybase to DB UDB EEE databases.
  • Reengineering of ETL to take advantage of the new database design. The new database has a different data model, and the ETL required significant rework; it is also significantly larger ( GB vs. GB).

    As part of a data conversion and performance improvement project, implemented improvements to the ETL:

  • Created new batches that utilized parallelism better.
  • Analyzed and tested scheduling strategies to fine-tune our Informatica batches.
  • Replaced lookups in mappings with join conditions.
  • Broke up complicated, slow mappings into multiple mappings that ran much faster and could be run in parallel.
  • Analyzed usage of cached vs. non-cached lookups and made changes as needed.
  • Simplified mappings by removing unneeded objects.
  • Removed ‘default’ setting for ports in expression transformations.
  • Minimized the usage of iif statements and other logical processing in mappings where practical.
  • Used monitoring tools in UNIX and the database (top, sar, dbArtisan, sp_monitor, sp_sysmon) as part of the tuning process.
  • Reviewed hardware architecture and recommended changes.
  • Cleaned up mappings to avoid lengthy log files by turning off verbose logging and eliminating warning messages.
  • Used Perl and shell scripts to preprocess data in UNIX.
  • Implemented a partitioning strategy similar to Informatica’s partitioning in version to increase parallelism.
  • Implemented documentation standards and practices to make mappings easier to maintain.
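    The lookup-to-join rewrite above trades a per-row lookup probe for one set-based join. The principle, sketched in Python over in-memory rows (hypothetical table and column names; in Informatica this meant moving the lookup into the source-qualifier SQL):

```python
def hash_join(orders, customers, left_key="cust_id", right_key="cust_id"):
    """Inner hash join: build a hash table on the smaller side once,
    then stream the larger side through it -- the set-based equivalent
    of replacing an uncached lookup transformation with a join."""
    build = {}
    for row in customers:
        build.setdefault(row[right_key], []).append(row)
    joined = []
    for order in orders:
        for match in build.get(order[left_key], []):
            joined.append({**match, **order})
    return joined
```

    The build side is scanned exactly once, so cost grows with the sum of the two row counts rather than their product, which is why the rewritten mappings ran faster and parallelized cleanly.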
