

Candidate Information
Title Data Migration Solutions Architect
Target Location US-CT-Washington

Candidate's Name, PMP
AWS Certified Solutions Architect Associate
PHONE NUMBER AVAILABLE (cell)
EMAIL AVAILABLE

PROFILE
Over 27 years of IT experience.
Acted as Data Warehousing Architect, DBA, ETL Tech Lead, Data Architect, and Data Migration Lead.
A results-driven professional who breaks down barriers between technology and business functions, communicates a clear value proposition, and delivers momentum with greater efficiency and effectiveness toward targeted goals.
A proven track record, individually and leading project teams, of delivering high-performance, highly available, secure solutions of the highest quality.
Expertise in Oracle database administration, data migration, and PL/SQL development.
Expertise in ETL development and design using Ab Initio and Informatica.
Expertise in migration from SQL Server to Oracle.
Designed ETL processes from various high-volume SAP sources and integrated SAP COPA and SAP FICO.
Designed and developed data migration from the EPS system to PRISM for federal clients.
Designed and developed data migration from Tririga to a Facility Management System application.
Provided technical leadership and guidance to developers and QA on development project teams.
Managed SOX-related issues to ensure the application remained SOX compliant.
Coordinated and managed infrastructure changes by recommending alternatives and planning implementation steps to minimize application impact.
Experienced in all phases of the project life cycle.
Good project management skills.
Extensive experience in enterprise data warehouse design using different design methodologies.
Expertise in creating data models.
Experienced with web services (RESTful, SOAP) technologies.
Experience in Data Governance, Master Data Management, Data Strategy, data security, data quality, and data lineage tools.
Worked with federal clients.
Extensive experience in scope management, providing estimations and preparing estimation templates for DW/BI.
Data quality management, data profiling, data assessment, data cleansing, and data repository work.
Expertise in RPA, Big Data, AWS, and machine learning.
Experience with centralized and distributed version control tools including GitHub, Bitbucket, Subversion, and CVS.
Expertise in implementation of TrueComp.
Strong skills in all phases of the SDLC, including requirements gathering and analysis, architecture design, coding, unit testing, deployment, performance tuning, and maintenance on Java/J2EE technologies.
Proficiency in microservices architecture.
Expertise in integrating, transforming, and consolidating data from structured and unstructured systems into forms suitable for analytics solutions.
Experience with data lake development and design.
Defined AWS architectures for cloud-based big data using EMR, S3, Lambda, and Redshift.
Defined real-time data ingestion using Kinesis and Kafka.
Knowledge of AWS products and services.

TECHNICAL SKILLS
AWS Tools/Big Data: DMS, SCT, RDS, Kinesis, Redshift, EMR, S3, Hive, Hadoop, Spark, DynamoDB (NoSQL), SNS, VPC, Lambda, Glue, Kafka, Python, Docker, Kubernetes, AWS EKS, DevOps, CI/CD pipelining, Databricks
Data Warehousing: Informatica Power Center PHONE NUMBER AVAILABLE.1, Ab Initio Co>Os 2.11/2.12/2.13, GDE 1.12/1.13, SAS 6.0, Business Objects XI/R2, Big Data, SAP BODS, LSMW, LTMC
Scheduling Tools: Autosys, CTRL-M
Code Management: CVS, PVCS, Visual SourceSafe, CM Synergy, ClearCase, Git, Jenkins
Web Tools: Perl, ColdFusion 4.0, Oracle SOA Suite 11g
RDBMS: Oracle 7.2/7.3/8.0.4/8.0.5/8i (8.1.6/8.1.7)/9i/10g/12c, Sybase 12.5, Teradata 2.0, SQL Server 2000/2005, Netezza
Oracle Applications/ERP: Oracle Applications 10.7/11.0.3/11i, SAP CRM 5.0, SAP BW 7.0, SAP FICO, SAP COPA, Salesforce
Designer Tools: Designer/2000, Erwin, Visio, UML
Development Tools: Developer/2000 (Forms 4.5, Reports 2.5, Graphics 2.5), Forms 3.0
DBA Tools: Server Manager, DB Artisan, TOAD, SQLDBA, Oracle Enterprise Manager
Languages: C, C++, COBOL, SQL, PL/SQL, Pro*C, HTML, Java, Unix shell programming, XML, T-SQL, BPEL, XQuery, XPath, XSLT, Python 3.7, Selenium
Other Tools: Kintana, TrueComp, Documentum, MS Project, ClearCase, Remedy, WinRunner, LoadRunner, Test Director, QTP, Tririga, SSIS, SSRS, PRISM, FEDBIZOPPS, FPDS, DOORS, Sunflower, Collibra, Machine Learning, Blue Prism (RPA), Splunk, Agile, Scrum, Kanban, Waterfall, CI/CD pipelining, microservices, Azure Data Factory, Snowflake, Power BI

WORK HISTORY
L3Harris, Principal, JAN 2023 – PRESENT
Created a landscaping document by analysing all ERP systems used at every site.
Created the data migration roadmap for the SAP S/4HANA and PLM Teamcenter implementation.
Created the data migration strategy for migrating from Costpoint to SAP S/4HANA.
Defined data cleansing include and exclude lists for the required sites and workstreams.
Established MDM and data governance processes.
Designed an enterprise data warehouse platform to integrate data between sites.
Designed and developed data pipelines to extract data from SAP BW and load it into Snowflake.
Developed various Power BI reports for time and labour.
Worked on a common data model across workstreams to support reports and analytics.
Created the cutover plan and coordinated cutover activities for the data migration mocks.

Veterans Affairs, Data Architect, JUN 2022 – DEC 2022
Led the on-prem to cloud migration of the VA GI billing system.
Created the data migration strategy.
Analysed the current billing system and provided a tool recommendation.
Created data mapping and implementation documents.
Executed data migration batch jobs and reconciled data between source and target systems.
Designed the ETL process.

CGI, Senior Consultant, NOV 2014 – MAY 2022
Oversaw the mapping of data sources, data movement, interfaces, and analytics to ensure data quality.
Developed and owned the Master Data Management strategy and roadmap for consumer data.
Worked closely with Digital IT and Group Data Management on the execution and operation of consumer MDM.
Created logical data models and translated them into physical database structures.
Supported production for the HCD Core data warehouse.
Supported POCs using machine learning.
Designed and developed ETL processes to incorporate new sources into the system.
Analyzed Hadoop clusters using big data analytic tools including Pig, Hive, and MapReduce.
Developed Oozie workflows to automate loading data into HDFS and Pig for data pre-processing.
Leveraged Sqoop to import data from RDBMSs into HDFS.
Performed cleaning and filtering of imported data using Hive and MapReduce.
Performed big data processing using Hadoop, MapReduce, Sqoop, and Oozie.
Implemented Spark using Python and Spark SQL for faster data processing.
Imported data from AWS S3 into Spark RDDs and performed transformations and actions on them.
Developed Spark jobs to parse JSON and XML data.
Evaluated and analyzed Hadoop clusters and big data analytic tools including Pig, HBase, and Sqoop.
Automated the creation of IT service requests.
Designed and developed automation scripts.
Designed and developed RPA solutions using Blue Prism.
Supported AWS database migration.
Designed and developed the Multifamily Data Lake.
Defined the AWS architecture and integrated cloud databases with on-prem systems.
Replaced Hortonworks with AWS EMR.
Created Glue jobs to provide data to on-prem downstream systems.
Moved legacy systems to cloud-based architecture.

MARATHON TS, Data Migration Lead, DEC 2014 – AUG 2015
This project implemented PRISM at FEMA, replacing the existing Electronic Procurement System (EPS).
Worked on data migration from the EPS system to PRISM.
Created the data migration strategy.
Created the data migration requirements and design documents.
Designed and developed the ETL process.
Migrated the Contract and Requisition modules from the EPS system to the PRISM product.
Analysed the source EPS system through both the data and the EPS front end.
Designed the schema for the data migration.

GATE GOURMET, DW Architect, JUN 2014 – SEP 2014
This project integrated different SAP sources for planning and budgeting, replacing the manual process for the income statement, balance sheet, and cash flow across all divisions of Gate Group.
Designed ETL processes from various high-volume SAP sources.
Analyzed different strategies for extracting huge volumes of data from SAP.
Created Informatica mappings integrating SAP COPA and SAP FICO.
Designed and developed an error handling process to check against MDM data.
Assisted in developing MDM strategies by analyzing data sources and the data acquisition process.

COMPUSEARCH SYSTEMS, Data Migration Lead, OCT 2012 – APR 2014
Worked on data migration from the EPS system to PRISM.
Created the data migration strategy.
Created the data migration requirements and design documents.
Migrated the Contract and Requisition modules from the EPS system, built by DSI Systems, to the PRISM product.
Analyzed the source EPS system through both the data and the EPS front end.
Involved in the design and support of middleware integration.

ENVIRONMENTAL PROTECTION AGENCY, Data Architect, FEB 2011 – SEP 2012
Worked on data migration from legacy systems to Oracle.
Created data models for different applications.
Designed and implemented database systems for the new applications.
Developed Oracle stored procedures and packages for various modules.
Tuned system performance as required.
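Several of the migration engagements above mention reconciling data between source and target systems after batch loads. As a minimal illustration only (the table names, keys, and rows below are hypothetical, not taken from any of these projects), a reconciliation pass can compare the two sides by primary key and a per-row checksum:

```python
# Sketch of post-migration reconciliation between a source and a target
# system. All sample data here is made up for illustration; a real run
# would fetch these rows from the source and target databases.
import hashlib

def row_fingerprint(row):
    """Stable checksum of one row, independent of dict key ordering."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows, key):
    """Compare two row sets by primary key and content checksum."""
    src = {r[key]: row_fingerprint(r) for r in source_rows}
    tgt = {r[key]: row_fingerprint(r) for r in target_rows}
    return {
        "source_count": len(src),
        "target_count": len(tgt),
        "missing_in_target": sorted(set(src) - set(tgt)),
        "extra_in_target": sorted(set(tgt) - set(src)),
        "mismatched": sorted(k for k in set(src) & set(tgt)
                             if src[k] != tgt[k]),
    }

# Hypothetical sample rows standing in for a source and target extract.
source = [
    {"contract_id": 1, "amount": "100.00"},
    {"contract_id": 2, "amount": "250.00"},
    {"contract_id": 3, "amount": "75.50"},
]
target = [
    {"contract_id": 1, "amount": "100.00"},
    {"contract_id": 2, "amount": "999.99"},  # value drifted during load
]

report = reconcile(source, target, key="contract_id")
```

Counts catch dropped or duplicated rows, while the checksum comparison catches rows that arrived but were transformed incorrectly; mismatched keys then feed a remediation or reload list.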
Analyzed auditing requirements, audit trail sizing, and retention criteria, and implemented auditing to meet NIST and FIPS 140 compliance standards for federal agencies.
Managed Oracle database security using available options.
Automated ETL and server monitoring processes with PowerShell scripting.
Migrated data from Tririga to the Facility Management System application.

FANNIE MAE, Technical Lead, OCT 2010 – JAN 2011
This project, Total Return Infrastructure (TRI), was developed to meet the corporation's need for timely portfolio evaluation and risk measurement.
Designed and developed an interface between the Capital Markets Data Store and Capital Markets BI.
Worked with multi-terabyte data sets.
Designed, implemented, automated, and managed configuration management processes and associated documentation.
Provided guidance on quality issue resolutions and production issues.

DEPARTMENT OF LABOR (DOL), Technical Lead, APR 2010 – SEP 2010
This project, OIS (OSHA Information System), consists of a suite of applications for the business processes of OSHA (Occupational Safety and Health Administration), including enforcement, compliance, and consultation.
Created logical data models and translated them into physical database structures that integrate with existing or proposed OLTP database structures.
Cleaned and maintained the database by removing old data.
Involved in the data migration from legacy systems to OIS.
Created and maintained the data dictionary.
Addressed data problems using conceptual, logical, and physical data models.
Offered supervision on data management and migration.
Created the dimensional model for reporting.
Created OLAP cubes.

HOUSING & URBAN DEVELOPMENT (HUD), Data Architect, APR 2009 – MAR 2010
This project, EIV, provides a portal to tenant income information in the form of household income data, as well as several income-based reports. EIV is a web-based system allowing secure access to information over the Internet using standard browsers.
The EIV project was migrated from SQL Server to Oracle.
Created and modified batch jobs for Debt Termination, SSA Verification, and NDNH.
Planned, developed, and implemented the SQL Server to Oracle migration.
Developed Oracle stored procedures for various modules.
Designed, developed, and implemented data migration to different environments.
Tuned system performance using various tuning techniques.
Set up the database environment for the migration from SQL Server to Oracle.
Created the dimensional model for the reporting application.

FANNIE MAE, Technical Lead, AUG 2004 – FEB 2009
This project, HCD Core, collects data from approximately thirty different sources, including MS Access, Sybase, Oracle, and Excel spreadsheets. It serves the Multifamily business side of Fannie Mae; Debt, Equity, and Bond are the three main areas for which data is collected and used by management for financial analysis.
Collected business requirements for integrating new sources for Equity and Bond and designed the process flow based on the business logic.
Developed and deployed Ab Initio and non-Ab Initio components from lower environments to production, and prepared the production deployment task list in coordination with the teams involved in each release.
Established processes for governing the identification, collection, and use of corporate metadata, and took steps to assure metadata accuracy and validity.
Designed and developed the ETL process for the IDB project, integrating new sources and enhancing existing transformations per requirements.
Oversaw the mapping of data sources, data movement, interfaces, and analytics to ensure data quality.
Managed virtual, cross-functional teams, including defining requirements.
Communicated at different levels within the organization about the advantages and disadvantages of various technologies and standards.
Created logical data models and translated them into physical database structures that integrate with existing or proposed database structures.
Led the development of the data warehouse project and tracked project plans.
Developed and implemented key components to create testing criteria that guarantee the accuracy and performance of the data architecture.

SPRINT, Technical Lead, APR 2004 – JUL 2004
The proposed solution implemented the new PM reporting system architecture (Option 4: Sprint ISSD implements the new system but transfers maintenance of business rules and Crystal Reporting to SPAR). All business rules in the various source systems were consolidated into a tool that facilitates easy manipulation and management of the business rules by the SPAR organization.
Performed metadata mapping from the legacy source system to target database fields and created Ab Initio DMLs.
Analyzed the current process, recommended steps for performance enhancement and streamlining, and created the design and production readiness documents.
Evaluated a number of Ab Initio graphs and suggested performance tuning tips based on business requirements.

ALLSTATE, Technical Lead, JAN 2004 – MAR 2004
Reviewed various data integration scripts and performed performance tuning to achieve the best results in the stress testing environment.
Worked on the migration of the existing commissions system to TrueComp.
Formalized a general plan of action for performance tuning of the data integration module.
Created the architecture for the purging process.
Designed a process for performance monitoring.
Involved in running and monitoring the pipeline run process in TrueComp.

SPRINT/SPRINT PCS, Technical Lead, JUN 2002 – DEC 2003
This project implemented TrueComp for Sprint PCS's existing commissions system.
Developed and deployed Ab Initio and non-Ab Initio components from lower environments to production.
Wrote Unix scripts for job scheduling/monitoring and data massaging, and developed Oracle stored procedures and packages in PL/SQL per business requirements.
Developed Ab Initio graphs per business specifications.
Involved in defining the ETL process for the commissions system.
Developed Informatica mappings to retrieve data from various sources, including mainframes.
Worked on the migration of the existing commissions system to TrueComp.
Defined the compensation rules in TrueComp.

CISCO SYSTEMS, Sr ETL Developer, AUG 2000 – APR 2002
Created, tuned, and tested stored procedures, functions, and packages implementing major functions of the system.
Developed extract, transform, load (ETL) processes for the data warehouse.
Wrote PL/SQL stored packages for data cleansing and error reporting.
Worked with the Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Informatica Repository Manager, and Informatica Server Manager.

ACCENTURE, DBA, JUN 1999 – JUL 2000
Performed performance tuning of the system and designed a plan to prepare the environments for stress testing.
Analyzed the sources, targets, and existing mappings.
Wrote Unix scripts for job scheduling/monitoring and data massaging, and developed Oracle stored procedures and packages per business requirements.
Created data structures such as tablespaces, data files, table segments, and index segments, and maintained them.

PEROT SYSTEMS, DBA, SEP 1997 – MAY 1999
Odyssey is a car rental system developed for National Car Rental, one of the leading car rental companies in the U.S., with offices and branches all over the country.
Created users and their roles and granted the appropriate roles.
Handled database creation, user account creation, privilege management, tablespace management, and role management.
Monitored and tuned database and application performance.
Performed extensive scripting (Unix shell, PL/SQL, SQL) to automate database maintenance.
Created, tuned, and tested stored procedures, functions, and packages.

BCSS, Software Engineer, AUG 1996 – AUG 1997
Analysis, design, and preparation of User Requirement Specifications (URS), Functional Specification Documents (FSD), and program specifications.
Wrote several stored procedures and database triggers on the backend to implement business logic.
Designed data input screens using Developer/2000 features such as lists of values, record groups, property classes, visual attributes, and program units.

WORK AUTHORIZATION
US Citizen

EDUCATION
Master of Computer Applications (MCA), September 1995, National Institute of Technology, Raipur, India
Master of Science (Physics), Ravenshaw College, India

CERTIFICATIONS
Project Management Professional (PMP), Dec 2007
AWS Certified Solutions Architect Associate, March 2022
