Description
Summary of This Role
The primary responsibility of the Senior Software Engineer is to develop and test data pipelines, from extract through consumption-layer population, for the GPN Lakehouse. The ETL developer performs tasks spanning data analytics, testing, and system architecture to deliver data pipelines that enable business solutions. At a minimum, the developer will be expected to perform the following tasks: ETL process management, data modelling, data warehouse/lake architecture, ETL tool implementation, data pipeline development, and ETL unit testing.
What Part Will You Play?
Independently develops and deploys ETL jobs in a fast-paced, object-oriented environment.
Receives and understands business requirements from clients via a business analyst, architect, or development lead to successfully develop applications, functions, and processes.
Conducts and is accountable for unit testing on development assignments.
Must be detail-oriented with the ability to follow through on issues.
Must be able to manage multiple tasks while working with other areas within the department.
Utilizes numerous sources to obtain and build development skills.
Enhances existing applications to meet the needs of ongoing efforts within software platforms.
Records and tracks time worked on projects and assignments.
Develops a general understanding of TSYS/Global Payments, software platforms, and the credit card industry.
Participates in team, department, and division meetings as required.
Performs other duties as assigned.
What Are We Looking For in This Role?
5 to 8 years of strong development background in ETL tools such as GCP Dataflow, PySpark, and SSIS
Experience with Google Cloud Platform (GCP): Pub/Sub, Datastore, BigQuery, App Engine, Compute Engine, Cloud SQL, Memorystore (Redis), etc.
Experience with GCP, AWS, Snowflake, or Azure is preferred
Proficient in Java, Python, PySpark, and SQL
Proficient in GCP BigQuery, Cloud Composer (Airflow), Pub/Sub, and Cloud Storage
Experience with build tools (e.g., Maven, Gradle)
Proficient in code repository management, branching strategies, and version control using Git, VSTS, TeamForge, etc.
Experience developing applications using Eclipse or IntelliJ IDEA
Excellent knowledge of relational databases, SQL, and JDBC drivers
Experience with API gateways: DataPower, APIM, Apigee, etc.
Strong analytical, planning, and organizational skills, with the ability to manage competing demands
Excellent communication skills, verbal and written; should be able to collaborate across business teams (stakeholders) and other technology groups as needed.
Experience with NoSQL databases is preferred
Exposure to the payments industry is a plus
Minimum Qualifications
Bachelor's degree in Quantitative Analytics, Statistics, Mathematics, Data Science, Software Engineering, Payment Information Systems, or a similar technical discipline; additional experience in lieu of a degree will be considered
Minimum of 5 to 8 years of relevant experience
Preferred Qualifications
Experience with a cloud data lakehouse is preferred
Master's degree in Quantitative Analytics, Statistics, Mathematics, Data Science, or a similar discipline
What Are Our Desired Skills and Capabilities?
Skills / Knowledge - A seasoned, experienced professional with a full understanding of area of specialization; resolves a wide range of issues in creative ways. This job is the fully qualified, career-oriented, journey-level position.
Job Complexity - Works on problems of diverse scope where analysis of data requires evaluation of identifiable factors. Demonstrates good judgment in selecting methods and techniques for obtaining solutions. Networks with senior internal and external personnel in own area of expertise.
Supervision - Normally receives little instruction on day-to-day work and general instructions on new assignments.