
Data Engineering Manager

India, Karnātaka, Bengaluru
Data Analytics

Job description

About CoinDCX

Trusted by more than 1.3 crore Indians, CoinDCX is one of the largest players in India's crypto ecosystem. Our vision is to create a more open and accessible future, and we believe Web3 will play an important part in it. We ventured into Web3 in 2021 by investing in promising Indian Web3 startups via CoinDCX Ventures and launching Okto, a DeFi app for accessing thousands of tokens on multiple DEXs across chains. We learn and build something new at CoinDCX every time.


The Web3 space is still new, and we're just getting started!


Inside CoinDCX’s Data Analytics Team

Our Data Analytics team is an awesome group of collaborators who love to solve first-of-its-kind problems with a lot of autonomy, creativity and fun.

As a team, they fulfil the needs, wants and desires of our patrons by finding the key levers that enable them to use our platforms better. You will help them do more.

At CoinDCX you will not only build the skills of the future, but also get to work with and learn from the best while building the future of Web3.

Coin your trust in us as we create magic together!


Who you are 

  • You’re passionate about everything crypto and Web3.

  • You take ownership and have a thirst for excellence, with an impact-driven and result-oriented mindset.

  • You grow while helping others grow with you.

  • You thrive on change, pay attention to detail, and have a passion for quality.

  • You love exploring new ideas to build something useful and are always curious to learn.


What you’ll do

  • Hire the best talent, identify the potential of each team member, and offer technical guidance, leadership, and advice on planning, designing, and implementing data solutions

  • Act as a project manager for data projects, mentor and grow the team by hiring skilled data engineers

  • Ensure data quality and security across every product vertical and related areas (data sharing, compliance)

  • Mentor and grow the data warehousing and data modeling team to properly establish a data-driven culture.

  • Design and build an infrastructure for extraction, transformation, and loading of data from a wide range of data sources

  • Overall, build and maintain the data foundations (tools, infrastructure, and pipelines) that help the marketing and sales teams, applying experience across all phases of the data management lifecycle.

  • Apply experience in data lakehouse and data warehouse architectures.

  • Design & Build reliable, scalable, CICD driven streaming and batch data engineering pipelines.

  • Collaborate with data scientists, ML engineers, and stakeholders to build a platform that enables data-driven decisions.

  • Create a taxonomy/data dictionary to communicate data requirements that are important to business stakeholders; work on acquiring external data sets through APIs and/or WebSockets and prepare physical data models on top of them

  • Create detailed logical models using business intelligence logic by identifying all the entities, attributes, and their relationships


Job requirements

What you’ll bring


  • 12+ years of experience in building data engineering pipelines and data governance using modern cloud architecture

  • Working knowledge of project management methodologies (preferably Agile) and the ability to guide and grow a team

  • Experience in managing engineers and guiding them through project planning, design, development, quality control, and deployment

  • Experience in offering technical leadership and guiding teams on data engineering best practices

  • Ability to optimize data models and conduct performance engineering for large-scale data

  • Knowledge of data visualization and self-service data preparation tools to maintain the flow of data for the analytics and visualization teams

  • Experience leading a data engineering team at early-stage, product-based startups is a must-have

  • Proficient in Databricks, Spark, Data Lake, Kafka/Kinesis

  • Experience with any cloud data warehouse (Redshift/Snowflake/BigQuery/Synapse)

  • Expert programming skills in any one of Python/Scala/Java (Python preferred)

  • Design, test-driven development, code review, and CI/CD implementation using GitHub/GitLab/Docker

  • Good understanding of ETL/ELT technology and processes

  • Basic knowledge of Apache Airflow would be a plus.



What’s in it for you


  • Unlimited Wellness Leaves

  • Personalised Mental Wellness & Caregiving sessions by Experts

  • Recharge and Rejuvenate through team outings

  • DYOB - Design your Own Benefit

  • LinkedIn Learning
