Eng 3, Product Dev Engineering
As part of the Big Data ecosystem, the Comcast dx team defines and executes on data strategy in realizing the promise of "data to the people". The Solution Engineer plays a critical role in this effort by linking our customers' needs to the data ecosystem, both within the dx team as well as the larger Comcast organization.
The Solution Engineer partners closely with Engagement Leads, Solution Architects, DevOps engineers, and data producers. As a Solution Engineer, you operate across multiple technical domains, from platforms to software and from raw metrics to end-user analysis and reporting. You are naturally curious and a problem solver who excels at organization, communication, efficiency, and execution. You have performed well in an analytical or technical role previously and have a strong understanding of the analytical workflow, from requirements definition and data discovery through analysis and operationalization. You understand that success in this role requires a high degree of technical competence, initiative, presence, and confidence. You embrace the opportunity to drive a team forward in the face of ambiguity and competing priorities.
This position requires an understanding of the fundamental role that data plays in the competitive landscape, as well as a demonstrated passion for data quality. You embrace collaboration as a central tenet of success, and understand the critical need to build rapport with key stakeholders and delivery teams.
-Elicit and gather requirements for large, revenue-impacting engagements.
-Map requirements to platform capabilities by collaborating with customers to define platform deliverables and to understand the workflow, onboarding process, and turn-up process.
-Onboard customers to data platforms as necessary by establishing data streams, data transfer processes, compute resources, and storage.
-Perform data discovery, including research, data dictionary reviews, validation, and data definitions, producing relevant analysis artifacts as needed.
-Partner with data SMEs and contribute to Avro schema development, review, and deployment.
-Meet regularly with Product Owners to review current platform capabilities and the roadmap.
-Apply working knowledge of underlying technologies, such as Kafka, AWS, and Spark, to assist with tasks as needed.
-Inspect and validate platform output with customers. Provide ongoing support as necessary.
-Participate in daily scrum as well as other status meetings to review current engagements, mitigate impediments, and adjust priorities.
-Some travel may be required.
-Create solution-level documentation such as Data Pipeline Descriptions (DPDs), Data Requirements Documents (DRDs), and Engineering Analysis (EA) artifacts.
-Perform ad hoc analysis of data on a variety of platforms, including Kafka, S3, Kinesis, Oracle, and HDFS.
-Coordinate with Project Management and Site Reliability Engineering (SRE) teams on platform deliverables, and assist with the creation of platform artifacts as needed.
-Collaborate with Architecture and Data Platform teams to solve difficult Big Data problems and achieve company objectives.
-Document the results of applied work processes and the practical application of technical standards.
-Adopt best practices as defined by the wider data team and collaborate with Delivery Owners to capture and report results to stakeholders.
-Regular, consistent, and punctual attendance.
-Must be able to work nights and weekends, variable schedule(s) as necessary.
-Other duties and responsibilities as assigned.
-Minimum 8 years' experience at a technology, financial, research, or analysis organization, working with large data sets and multivariate analysis.
-Bachelor's Degree in engineering, mathematics, computer science, statistics, physics, economics, operations research, or a related field. Advanced degree or post-graduate work preferred.
-Coursework in advanced mathematics, computer programming, and statistics desired. A background in statistical analysis and/or engineering is a huge plus.
-Demonstrated experience authoring complex technical specifications and illustrating system-level diagrams.
-Demonstrated understanding of software architecture principles, such as scalability, quality, and maintainability, with the ability to apply desired architectural characteristics to Big Data solutions.
-Demonstrated quantitative aptitude, including forecasting, and a well-developed analytical skill set.
-Intermediate-level proficiency with Linux and Bash. Exposure to Perl, Python, REST, JSON, Scala, Java, R, Kafka, Spark, Zeppelin, Databricks, and/or AWS is a huge plus.
-Expertise in at least one industry-standard diagramming tool, such as Visio, OmniGraffle, Draw.io, or Gliffy.
-Proficient with SQL, preferably with experience in advanced concepts, stored procedures, and database object models.
-Basic understanding of the Hadoop ecosystem; familiarity with HDFS, Pig, and Pentaho is a plus.
-Proficient with an industry-standard issue tracking and collaboration tool suite, such as Atlassian JIRA and Confluence, or Rally.
-Working knowledge of GitHub and Jenkins is a plus.
-Significant experience with desktop data analysis tools, such as Excel and Tableau. Proficient with Microsoft Office.
-Experience interacting effectively with senior leadership, lead architects, and lead developers.
-Proven ability to quickly locate and analyze extremely large volumes of data, and a desire to work with Big Data sets.
-Interest in algorithms, state machines, and the use of heuristic models for algorithm development and simulations.
-Ability to rapidly prototype and iterate on vague, quickly changing requirements.
-Must be a team player, able to work effectively with technical teams and business users alike.
-Ability to communicate effectively with Senior Managers, Directors, and VPs across the entire enterprise.
-Capable of building strong relationships with leaders across the enterprise.
Comcast is an EOE/Veterans/Disabled/LGBT employer