At Dell Technologies we are enabling the next data decade for our businesses and customers. With some of the biggest data implementations in the industry, the Dell Digital Data team is constantly reimagining and innovating the way data is consumed and enabled.
In this role, we are looking for individuals who are passionate about data and the possibilities of leveraging it to power intelligent, insightful and impactful outcomes. You will work with some of the best minds and talents in the data space, including business stakeholders, architects, application teams, and privacy, security and governance teams. We believe in empowering our teams and leaders to take end-to-end ownership of outcomes, with the highest focus on quality, stability, security and customer satisfaction.
Design and develop analytical solutions for business problem statements and progress them from design through the software development lifecycle to implementation, using various analytics tools and techniques.
Responsibilities
- Analyze business needs and create software and hardware solution blueprints.
- Work closely with Data Product Managers and Solution Architects to define use cases and measurable business metrics.
- Work with engineering teams to validate solution feasibility, build stories and architect solutions for projects; drive use cases through the complete lifecycle.
- Prepare flow charts, system diagrams and design documentation to assist in development and problem analysis.
- Design, code, test and debug software according to Dell’s standards, policies and procedures.
- Mentor junior team members on technical and functional skills; be a strong team player with functional knowledge of business processes.
- Apply broad knowledge of application programming processes and procedures to complete complex assignments.
- Analyze diverse and complex problems.
- Lead large-budget projects.
- Troubleshoot program errors effectively.
- Build highly reliable, high-quality, high-volume data pipelines.
- Set up batch, micro-batch and streaming pipelines.
- Ingest, transform and process data in batch and near real time.
- Implement automated tests, tie-outs and self-healing data jobs.
- Build products that require minimal to no support after rollout.
- Communicate complex insights in a precise and actionable manner.
- Think differently, align with industry standards and stay aware of emerging technologies and industry trends.
- Create and present technical white papers.
Requirements
- 5–8 years of relevant IT experience in data warehousing technologies, with excellent communication and analytical skills
- Understanding of Big Data technologies
- Hands-on experience with the following skill set:
- Comfortable with ETL concepts
- Experience working with Teradata/Oracle/SQL Server/Greenplum as data warehouse databases
- Strong SQL skills (DDL, DML, procedural)
- Proficiency in Unix and shell scripting
- Knowledge of technologies such as Spark, Hadoop HDFS, Hive and HBase
- Hands-on experience with change data capture and ingestion tools such as StreamSets and Informatica
- Knowledge of Kafka and Oracle GoldenGate
- Excellent knowledge of scheduling tools such as Control-M
- Strong experience with source code repositories such as Git and SVN, and with CI tools such as Jenkins
- Working knowledge of near-real-time (NRT) processing and its associated tech stack (Spark, MemSQL)
- Understanding of end-to-end development and test processes
- Understanding of data architecture, data profiling and data quality
- Adherence to standards, audits and tie-outs
- Excellent analytical and problem-solving skills
- Strong communication and presentation skills
- Experience across diverse industries, tools and data warehousing technologies
- Bachelor of Engineering or Master of Computer Applications
- Experience working in Agile (Scrum) methodology
Job ID: R078485