Job Responsibilities:
- Design, develop, and maintain data pipelines
- Develop optimized SQL queries against large datasets to solve business problems
- Translate complex business logic into efficient feature-engineering pipelines that process large volumes of data
- Interact with the client to understand business needs and apply analytical concepts to generate actionable business insights
- Effectively visualize data and create dashboards
Skills:
Must have:
- Very good knowledge of SQL
a. Advanced querying and handling of large datasets
b. Writing optimized queries
c. Creating stored procedures
d. Joins, nested queries
- Exposure to Python (basic libraries such as NumPy/pandas)
- Strong verbal and business communication skills
- Strong business acumen and a demonstrated aptitude for analytics that drive action
- Effective time management and attention to detail
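The SQL skills listed above (joins, nested queries, querying large datasets) can be illustrated with a small sketch. This uses Python's built-in `sqlite3` module against a toy schema; the table names, columns, and data are illustrative assumptions, not from the posting:

```python
import sqlite3

# Toy tables standing in for large analytics datasets (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 200.0);
""")

# A join combined with a nested subquery: regions whose total spend
# exceeds the overall average order amount.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    GROUP BY c.region
    HAVING SUM(o.amount) > (SELECT AVG(amount) FROM orders)
""").fetchall()
```

In a production setting the same patterns apply at BigQuery scale, where query cost also depends on how much data is scanned, which is why writing optimized queries is called out as a must-have.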
Good to Have:
- Experience in using Google Cloud Platform
- Understanding of BigQuery architecture, table partitioning, clustering, table types, etc.
- Exposure to data architecture and schema design
- Experience with workflow schedulers / ETL jobs (Apache Airflow, cron)
- Experience in using Python for data wrangling and manipulation
- Good knowledge of visualization tools such as Tableau and Power BI
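Workflow schedulers such as Airflow model an ETL pipeline as a directed acyclic graph (DAG) of tasks executed in dependency order. A minimal sketch of that idea using only Python's standard-library `graphlib`; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL tasks; each entry maps a task to the upstream tasks
# it depends on, mirroring how an Airflow DAG wires extract -> transform -> load.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_features": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_features"},
}

# static_order() yields every task after all of its dependencies,
# which is the execution order a scheduler would follow.
order = list(TopologicalSorter(dag).static_order())
```

A real scheduler adds retries, backfills, and cron-style triggers on top of this ordering, but the dependency graph is the core concept.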
Eligibility:
- Master's or Bachelor's degree in Math, Statistics, Economics, Computer Science or related analytics field from top-tier universities with strong record of achievement
- 2-6 years of experience (at least 2 years of SQL experience in an analytics environment)