Reference #: 18-01274
Title: Data Integration Developer - Big Data (Hadoop/ETL Experience)
Location: Phoenix, Arizona
Position Type: Direct Placement
Start Date: 10-07-2018
Purpose of the Job
  • Designs, develops, and deploys data integration solutions on both RDBMS and Big Data (Hadoop) platforms.
  • Creates and implements business intelligence and extract, transform, and load (ETL) solutions using programming, performance tuning, and data modeling.
  • Creates and implements big data ingestion and prepares data for consumption and analysis (MapReduce, Hive, HBase).
Essential Job Functions and Responsibilities
  • Learn the area's direct flow and how it affects surrounding systems and operational areas.
  • Architect, design, construct, test, tune, deploy, and support Data Integration solutions on Hadoop and MPP platforms.
  • Work closely with the Scrum team and Data Scientists to achieve company business objectives.
  • Write Python scripts to load data from different interfaces into Hadoop, and write Sqoop scripts to import, export, and update data in the RDBMS (see the ingestion sketch after this list).
  • Collaborate with other technology teams and architects to define and develop solutions.
  • Research and experiment with emerging Data Integration technologies and tools related to Big Data.
  • Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures, ensuring a high degree of data quality.
  • Assist users and analysts with the development of MapReduce and Spark jobs.
  • Develop, write, and implement processing requirements, and conduct post-implementation reviews.
  • Facilitate and/or create new procedures and processes that support advancing technologies or capabilities
  • Design & Implement Extract, Transform, and Load (ETL) solutions utilizing SSIS or TalenD
  • Apply data mining rules
  • Create logic, system, and program flows for complex systems, including interfaces and metadata
  • Write and execute unit test plans. Track and resolve any processing issues.
  • Implement and maintain operational and disaster-recovery procedures.
  • Participate in the review of code and/or systems for proper design standards, content and functionality.
  • Participate in all aspects of the Systems Development Life Cycle
  • Analyze files and map data from one system to another
  • Adhere to established source control versioning policies and procedures
  • Meet timeliness and accuracy goals.
  • Communicate status of work assignments to stakeholders and management.
  • Responsible for technical and production support documentation in accordance with department standards and industry best practices.
  • Maintain current knowledge on new developments in technology-related industries
  • Participate in corporate quality and data governance programs
  • The position requires a full-time work schedule. Full-time is defined as working at least 40 hours per week, plus any additional hours as requested or as needed to meet business requirements.
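As a rough illustration of the ingestion responsibilities above, the sketch below stages a local extract into HDFS and imports an RDBMS table with Sqoop, driven from Python. The JDBC URL, HDFS paths, table name, account, and password file are hypothetical placeholders, not details from this posting.

    import subprocess

    # Hypothetical endpoints and paths -- real values would come from job configuration.
    JDBC_URL = "jdbc:postgresql://dbhost:5432/sales"   # assumed RDBMS connection string
    HDFS_DIR = "/data/raw/orders"                      # assumed HDFS landing directory

    def hdfs_put(local_path, hdfs_path):
        """Stage a local extract file into HDFS via the hdfs CLI."""
        subprocess.run(["hdfs", "dfs", "-put", "-f", local_path, hdfs_path], check=True)

    def sqoop_import(table):
        """Import one RDBMS table into HDFS with Sqoop."""
        subprocess.run(
            ["sqoop", "import",
             "--connect", JDBC_URL,
             "--table", table,
             "--target-dir", HDFS_DIR + "/" + table,
             "--username", "etl_user",                      # assumed account
             "--password-file", "/user/etl/.db_password",   # keeps credentials off the command line
             "--num-mappers", "4"],
            check=True,
        )

    if __name__ == "__main__":
        hdfs_put("orders_extract.csv", HDFS_DIR)
        sqoop_import("orders")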
Employment Requirements
Required Work Experience
  • 4 years of experience in computer programming, query design, and databases
Required Education
  • High school diploma or GED in a general field of study
Preferred Work Experience
  • 6+ years of experience building and managing complex Data Integration solutions.
  • 6+ years of experience with distributed, highly scalable, multi-node environments.
Preferred Education
  • Bachelor's Degree in Information Technology or a related field
Preferred Certifications
  • MS SQL Certification or other certification in current programming languages
Required Job Skills
  • Intermediate skill in the use of office equipment, including copiers, fax machines, scanners, and telephones
  • Intermediate PC proficiency in spreadsheet, database, and word-processing software
  • Advanced knowledge of business intelligence, programming, and data analysis software
  • Intermediate knowledge of Microsoft SQL databases.
  • Intermediate proficiency in T-SQL, NZ-SQL, PostgreSQL, data tuning, enterprise data modeling and schema change management.
  • Experience ingesting data into Hadoop and proficiency with common Hadoop tools such as NiFi, Hive, Pig, Oozie, HBase, Flume, Sqoop, YARN, MapReduce, Ambari, Spark, Java, and Python
  • Intermediate knowledge of Python scripting
  • Strong object-oriented design and analysis skills
  • Experience consuming, organizing, and analyzing JSON and XML messages as data (see the sketch after this list).
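To make the JSON/XML skill concrete, here is a minimal Python sketch that parses one message of each kind into flat rows; the payloads and field names are invented for illustration.

    import json
    import xml.etree.ElementTree as ET

    # Invented sample messages standing in for real upstream payloads.
    json_msg = '{"order_id": 1042, "items": [{"sku": "A1", "qty": 2}]}'
    xml_msg = '<order id="1042"><item sku="A1" qty="2"/></order>'

    # JSON parses into native dicts/lists; pull out the fields of interest.
    order = json.loads(json_msg)
    rows = [(order["order_id"], i["sku"], int(i["qty"])) for i in order["items"]]

    # XML parses into an element tree; extract the equivalent fields.
    root = ET.fromstring(xml_msg)
    rows += [(int(root.get("id")), i.get("sku"), int(i.get("qty")))
             for i in root.iter("item")]

    print(rows)  # [(1042, 'A1', 2), (1042, 'A1', 2)]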
Required Professional Competencies
  • Knowledge of agile development practices
  • Strong analytical skills to support independent and effective decisions
  • Ability to prioritize tasks and work with multiple priorities, sometimes under limited time constraints.
  • Perseverance in the face of resistance or setbacks.
  • Effective interpersonal skills and the ability to maintain positive working relationships with others.
  • Verbal and written communication skills and the ability to interact professionally with a diverse group of executives, managers, and subject matter experts.
  • Systems research and analysis skills, and the ability to write and present business intelligence documentation.
  • Demonstrate the ability to stay current on global threats and vulnerabilities.
  • Maintain confidentiality and privacy
Required Leadership Experience and Competencies
  • Build synergy with a diverse team in an ever-changing environment.
Preferred Job Skills
  • Advanced knowledge of Data Integration
  • Advanced proficiency with relational technologies that supplement RDBMS tool sets
  • Advanced proficiency in Talend Open Studio or Profisee Maestro Enterprise Data Warehouse (EDW) tools.
  • A flair for data, schemas, and data modeling, and for bringing efficiency to the big data life cycle.
  • A minimum of 1-2 years of experience with cloud computing, Azure preferred.
  • Experience supporting SparkR and R
  • Proficiency with agile development practices
  • Experience collecting and storing data from RESTful APIs (see the sketch after this list).
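A minimal sketch, assuming a hypothetical endpoint, of collecting records from a RESTful API and landing them as newline-delimited JSON using only the Python standard library:

    import json
    import urllib.request

    # Hypothetical endpoint; a real job would read the URL and auth from configuration.
    URL = "https://api.example.com/v1/orders?page=1"

    def fetch_records(url):
        """Collect one page of records from a RESTful API and return the parsed JSON."""
        with urllib.request.urlopen(url, timeout=30) as resp:
            return json.loads(resp.read().decode("utf-8"))

    def store_jsonl(records, path):
        """Land records as newline-delimited JSON, a common ingestion staging format."""
        with open(path, "w", encoding="utf-8") as f:
            for rec in records:
                f.write(json.dumps(rec) + "\n")

    if __name__ == "__main__":
        store_jsonl(fetch_records(URL), "orders.jsonl")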
Preferred Professional Competencies
  • Advanced systems research and analysis expertise
  • Solid technical ability and problem-solving skills
Preferred Leadership Experience and Competencies
  • Experience participating in a highly collaborative Scrum team.
  • Experience in developing, maintaining and enhancing big data solutions.
  • Demonstrated experience and patience in mentoring and sharing knowledge with peers.