Responsibilities
- Define and implement data marts and common data models specifically tailored to support the sales domain.
- Gain a deep understanding of end-to-end data models and workflows within the sales domain, including managing data flows for Salesforce to maximize value extraction.
- Continuously identify opportunities for cost optimization, and address functional and non-functional issues through test automation, compute analysis, and performance profiling.
- Actively research and propose open-source and vendor solutions that can accelerate the achievement of business objectives.
- Maintain high standards for code quality, organization, and adherence to best practices.
- Work effectively as part of a global team spanning Hong Kong, New York, and London.
Qualifications
A pragmatic technologist with a strong sense of end-to-end ownership who thinks first and foremost about how the application of technology can accelerate delivery on business objectives.
- Hands-on experience with Python.
- Hands-on experience with BI tools such as Tableau, Power BI and Looker.
- Hands-on experience with Salesforce.
- Strong SQL and analytical skills.
- Hands-on experience with SQL / NoSQL databases such as Redis, Postgres, SingleStore and MongoDB.
- Familiarity with big data technologies and data lake ecosystems such as Databricks, Snowflake etc.
- Well versed in designing data structures, event schemas and database schemas.
- Well versed in file formats such as CSV, JSON, Parquet, Avro and Iceberg.
- Solid and opinionated knowledge of testing methodologies.
- Solid and opinionated knowledge of coding principles and coding standards.
- Well versed in standard SDLC practices and tooling around build, test, deploy etc.
- Ability to operate and thrive in a dynamic startup environment.
- Passionate about knowledge sharing and mentoring.
- Eligibility to work in London.
Bonus Experience
- Hands-on experience working with cloud technologies.
- Familiarity with Airflow, DBT, REST APIs, Kubernetes, Istio and Docker.
- Experience delivering simple web-based UIs to visualize data.
- Experience working with petabyte-scale datasets.
- Experience with middleware such as Chronicle Queue, Aeron, RabbitMQ and Kafka.

Bullish is proud to be an equal opportunity employer. We are fast evolving and striving towards being a globally diverse community. With integrity at our core, our success is driven by a talented team of individuals and the different perspectives they are encouraged to bring to work every day.

Job ID: JR2001042