ETL Developer Job at DISH in Englewood, CO

ETL Developer Details

DISH - Englewood, CO

Employment Type: Full-Time

DISH is a Fortune 250 company with more than $14 billion in annual revenue that continues to redefine the communications industry. Our legacy is innovation and a willingness to challenge the status quo, including reinventing ourselves. We disrupted the pay-TV industry in the mid-90s with the launch of the DISH satellite TV service, taking on some of the largest U.S. corporations in the process, and grew to be the fourth-largest pay-TV provider. We are doing it again with the first live, internet-delivered TV service – Sling TV – that bucks traditional pay-TV norms and gives consumers a truly new way to access and watch television.
Now we have our sights set on upending the wireless industry and unseating the entrenched incumbent carriers.
We are driven by curiosity, pride, adventure, and a desire to win – it’s in our DNA. We’re looking for people with boundless energy, intelligence, and an overwhelming need to achieve to join our team as we embark on the next chapter of our story.
Opportunity is here. We are DISH. #LI-HT1
Job Duties and Responsibilities
The DISH Wireless organization is looking for an ETL Developer to deliver large, complex data migration projects. In this role, you must be able to work in a fast-paced, quickly evolving environment where initial requirements are vague and changing. The ability to quickly understand project goals and impacted systems, and to independently seek out subject matter experts to define requirements and devise solutions, will be critical.
Primary responsibilities fall into the following areas:


  • Work with various business and technical stakeholders to understand the systems, data sources, business goals, technical goals, timelines, and financial impacts of multiple data conversion projects.
  • Perform deep analysis of the data in source and target systems to document detailed data mapping requirements, including analyzing data quality, assessing data availability, and defining data transformations.
  • Build data ingestion pipelines that ingest data from APIs, SFTP, and streaming sources (Kafka/Kinesis), focusing on reusable components that can be leveraged on future conversions.
  • Develop detailed metrics for all data migrations to report conversion progress and error rates.
  • Work well in a team environment where knowledge and reusable components are proactively shared.
  • Use procedural or scripting language skills to code the required data transformations.
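To illustrate the kind of work described above, here is a minimal, hypothetical sketch of a reusable conversion step that applies a data transformation record by record while tracking the progress and error-rate metrics the role calls for. This is not DISH's actual tooling; every name (`ConversionMetrics`, `run_conversion`, the example schema fields) is invented for illustration.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Iterable

@dataclass
class ConversionMetrics:
    """Tracks migration progress and error rates for reporting."""
    processed: int = 0
    failed: int = 0
    errors: list = field(default_factory=list)

    @property
    def error_rate(self) -> float:
        return self.failed / self.processed if self.processed else 0.0

def run_conversion(records: Iterable[dict],
                   transform: Callable[[dict], dict],
                   sink: Callable[[dict], Any],
                   metrics: ConversionMetrics) -> ConversionMetrics:
    """Apply a transformation to each source record and load it,
    recording per-record failures instead of aborting the whole run."""
    for record in records:
        metrics.processed += 1
        try:
            sink(transform(record))
        except Exception as exc:  # in practice, catch narrower exceptions
            metrics.failed += 1
            metrics.errors.append((record, str(exc)))
    return metrics

# Hypothetical example: map a source schema onto a target schema.
def to_target_schema(rec: dict) -> dict:
    return {"id": int(rec["customer_id"]), "name": rec["name"].strip()}

loaded = []
source = [
    {"customer_id": "1", "name": " Ada "},
    {"customer_id": "oops", "name": "Bob"},  # will fail the int() cast
]
m = run_conversion(source, to_target_schema, loaded.append, ConversionMetrics())
print(m.processed, m.failed, m.error_rate)  # 2 1 0.5
```

Because the transform and sink are passed in as callables, the same loop can be reused across conversions, whether the sink writes to a database, a file, or a message queue.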

Skills and Qualifications
A successful ETL Developer will have the following:

  • 5+ years of experience delivering data solutions on a variety of data warehousing, big data, and cloud data platforms.
  • 3+ years of experience with distributed data technologies (e.g., Spark, Kafka, Flink) for building efficient, large-scale ‘big data’ pipelines.
  • Strong software engineering experience, with proficiency in at least one programming language such as Python, Scala, or an equivalent (e.g., for Spark development).
  • Experience building both real-time and batch data ingestion pipelines using best practices.
  • Experience building streaming ingestion pipelines using Kafka Streams, Apache Flink, or similar tools.
  • Experience with cloud computing platforms such as Amazon Web Services (AWS) and Google Cloud.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Experience with relational SQL and NoSQL databases, such as Postgres and MongoDB.
  • Experience with change data capture (CDC) tools such as Attunity or GoldenGate preferred.
  • Experience with scheduling tools, preferably Control-M, Airflow, or AWS Step Functions.
  • Strong interpersonal, analytical, problem-solving, influencing, prioritization, decision-making, and conflict-resolution skills.
  • Excellent written and verbal communication skills.
  • Strong organizational skills and the ability to communicate effectively, both in writing and verbally, with IT, business, and leadership. Strong collaboration and facilitation skills are extremely helpful.
  • Bachelor’s degree in Computer Science, Information Systems Management, Engineering, or a related technical field, or an equivalent combination of education and experience.
Compensation: $85,905.00 - $136,190.00 per year
Benefits
From versatile health perks to new career opportunities, check out our benefits on our careers website.
Employment is contingent on successful completion of a pre-employment screen, which may include a drug test.

Posted: 3 years ago