The dx Team is responsible for Data Engineering within Comcast's DevOps model; one of its major goals is to harmonize the data ingestion and consumption layers across Comcast, creating enterprise data sources that serve as a single version of truth.
The Big Data Software Developer will develop (code/program), test, and debug ETL (Extract/Transform/Load) pipelines that answer technically challenging business requirements (complex transformations, high data volumes).
Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services
- Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers
- Win as a team - make big things happen by working together and being open to new ideas
- Drive results and growth
- Respect and promote inclusion and diversity
- Do what's right for each other, our customers, investors and our communities
- Analyzes and determines data integration needs
- Evaluates and plans software designs, test results and technical manuals using the Big Data (Hadoop) ecosystem
- Reviews literature and current practices relevant to the solution of assigned projects in the Data Warehousing and Reporting areas
- Experience with DevOps tools (GitHub, Jira) and methodologies (Agile, Scrum, Kanban, Test Driven Development)
- Hands-on experience with Hadoop ecosystem tools such as Spark, YARN, HDFS, Hive, and Sqoop, and an understanding of systems performance data (collection, monitoring, analysis); a batch ETL sketch follows this list
- Experience with Teradata utilities such as BTEQ, MLOAD, FASTLOAD, FASTEXPORT, and TPT in large data warehouse environments
- Exposure to data integration and storage in AWS - S3, Lambda, Glue Crawlers, Data Pipelines
- Exposure to data loads in Databricks
- Programs new software using Spark, Scala, Kafka, Sqoop, and SQL; a streaming ingestion sketch follows this list
- Deep knowledge of SQL and data sourcing technologies such as Informatica
- In-depth, hands-on experience in ETL design and development
- Monitors job performance, file system/disk-space usage, cluster and database connectivity, log files, and backup/security, and troubleshoots various user issues
- Creates and executes a capacity planning strategy for the Hadoop platform
- Edits and reviews technical requirements documentation
- Displays knowledge of software engineering methodologies, concepts, skills and their application in the area of specified engineering specialty (like Data warehousing)
- Displays knowledge of, and ability to apply, process software design and redesign skills
- Displays in-depth knowledge of, and ability to apply, project management skills
- Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed as required
- Consistent exercise of independent judgment and discretion in matters of significance
- Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary
- Other duties and responsibilities as assigned
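For illustration, here is a minimal sketch of the kind of batch ETL job referenced in the qualifications above, written in Scala for Spark with Hive support. The table names (landing.raw_events, dw.daily_event_counts) and column names are hypothetical placeholders, not actual Comcast schemas.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyEventCounts {
  def main(args: Array[String]): Unit = {
    // Hive-enabled session; assumes the cluster provides Hive metastore access.
    val spark = SparkSession.builder()
      .appName("daily-event-counts")
      .enableHiveSupport()
      .getOrCreate()

    // Extract: read a raw landing table (hypothetical name).
    val raw = spark.table("landing.raw_events")

    // Transform: basic cleansing plus a daily aggregate by event type.
    val daily = raw
      .filter(col("event_ts").isNotNull)
      .withColumn("event_date", to_date(col("event_ts")))
      .groupBy(col("event_date"), col("event_type"))
      .agg(count(lit(1)).as("event_count"))

    // Load: overwrite the target warehouse table (hypothetical name).
    daily.write
      .mode("overwrite")
      .format("parquet")
      .saveAsTable("dw.daily_event_counts")

    spark.stop()
  }
}
```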
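Similarly, a sketch of streaming ingestion with Spark Structured Streaming from Kafka, one of the technologies listed above. The broker address, topic name, and HDFS paths are placeholders, and the job assumes the spark-sql-kafka connector is on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object KafkaIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-ingest")
      .getOrCreate()

    // Source: a Kafka topic (broker and topic are hypothetical).
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers key/value as binary; cast the payload to a string for downstream parsing.
    val events = stream.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

    // Sink: append micro-batches to a raw zone on HDFS (paths are placeholders).
    val query = events.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/raw/events")
      .option("checkpointLocation", "hdfs:///checkpoints/events")
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}
```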
Disclaimer: The above information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is an EEO/AA/Drug-Free Workplace and an equal opportunity employer.