Cryptocurrency traders are looking for a strong global platform that helps them trade faster, better, and smarter. We have a platform that meets their needs. As our user base scales, so does the amount of data we collect. This data is rich with insights and requires constant modelling and monitoring to understand our users better. Now there's a gap we want YOU to address.
You fulfill the needs, wants, and desires of our users by finding the key levers that enable them to get more out of our platforms. You help them do more. As a Business Analytics Engineer, you ensure all of the above and more. You make a difference!
Make a difference by:
Designing and building reliable, scalable, CI/CD-driven streaming and batch data engineering pipelines.
Collaborating with data scientists, ML engineers, and stakeholders to build a platform that enables data-driven decisions.
Overseeing and governing the expansion of the current data architecture, and optimizing queries and the data warehouse.
Creating a conceptual data model to identify key business entities and visualize their relationships.
Creating detailed logical models, applying business intelligence logic to identify all entities, attributes, and their relationships.
Creating a taxonomy/data dictionary to communicate the data requirements that matter to business stakeholders.
Acquiring external data sets through APIs and/or WebSockets and preparing physical data models on top of them.
Acting as team lead: staying current with the new and evolving tech stack, and guiding and mentoring a team of Data Engineers.
What you bring:
4 to 8 years of experience building data engineering pipelines and data governance using modern cloud architecture.
Proficiency in Databricks, Spark, Data Lake, and Kafka/Kinesis.
Experience with any cloud data warehouse: Redshift, Snowflake, BigQuery, or Synapse.
Expert programming skills in at least one of Python, Scala, or Java (Python preferred).
Experience with design, test-driven development, code review, and implementing CI/CD using GitHub/GitLab/Docker.
A good understanding of ETL/ELT technologies and processes.
Basic knowledge of Apache Airflow would be a plus
Basic knowledge of data modeling tools (dbt, Dataform, Alteryx, Informatica, etc.) would be a plus.
Experience in MLOps and tools for model reproducibility, deployment, packaging, monitoring, and retraining.
Experience with Lakehouse architecture using Databricks.