4-8 Years
Mumbai
Salary: As per industry standards.
Openings: 1
Posted: 08-04-2022
Job Views: 798
Job Details
Job Description:
A systems-level approach; you have worked across the entire stack, from the OS all the way up to the APIs and the application layer.
Advanced proficiency in SQL (primarily T-SQL and Legacy SQL).
Experience developing and orchestrating large ETL pipelines.
Comfort with multiple languages, such as R and Python, to supplement SQL for data extraction and preparation.
A mind for scale; you are curious about building distributed systems and often ask, "but does it scale?"
Ability to assist and collaborate in the design, construction, testing, delivery, and maintenance of complex data pipelines.
A passion for troubleshooting and finding long-term solutions; you do not accept the easy solution as the only solution, and will dig to ensure that we put the long-term benefit of our merchants and stakeholders first.
Well-founded opinions about writing code and approaching problems; comfortable with automated
testing, code refactoring, and software engineering best practices.
Experience with designing and deploying data-warehouses and data-marts.
Excitement about working with a remote team; you value collaborating on problems, asking questions, delivering feedback, and supporting others in their goals, whether they are in your vicinity or entire cities apart.
Requirements:
Creating and deploying infrastructure, in the form of data marts, to handle multiple streams of structured and unstructured data that will feed into our core product domains.
Delivering further automation in our key processes, from data extraction and entry all the way through consumption.
Developing configuration management and automation tools.
Designing data quality and integrity monitoring capabilities to ensure data is fit for consumption.
Building a world-class data analytics platform to help both internal and external customers, focusing on improving the effectiveness of shipping.
It’d Be Nice If You Have Experience With:
Working with data at terabyte scale.
Any cloud platform (GCP, AWS, or Azure): compute instances, deployment tooling, storage, networking, etc.
Basic machine learning algorithms.
Statistical methods and their use in data wrangling.
City: Mumbai
State: Maharashtra
PostalCode: 230532
Recruiter: Parth Parmar - +91 95107 15429
Created Date: 08-04-2022
Industry: IT Software