HR Manager at Hypersonix
Hypersonix - Data Architect (8-16 yrs)
Hypersonix.ai is disrupting the Business Intelligence and Analytics space with AI, ML, and NLP capabilities that drive specific business insights through a conversational user experience. Hypersonix.ai has been built from the ground up with new-age technology to simplify the consumption of data for our customers in Restaurants, Hospitality, and other industry verticals.
Hypersonix.ai is seeking a Data Architect who can work closely with customers to understand their data sources, acquire data, and drive product success by delivering insights based on customer needs.
Responsibilities:
- Lead and deliver complete application lifecycle design, development, deployment, and support for actionable BI and Advanced Analytics solutions
- Design and develop data models and ETL processes for structured and unstructured data distributed across multiple cloud platforms
- Develop and deliver solutions with data streaming capabilities for large volumes of data
- Design, code and maintain parts of the product and drive customer adoption
- Build data acquisition strategy to onboard customer data with speed and accuracy
- Work both independently and with team members to develop, refine, implement, and scale ETL processes
- Provide ongoing support and maintenance for live clients' data and analytics needs
- Define the data automation architecture to drive self-service data load capabilities
Requirements:
- Bachelor's/Master's/Ph.D. in Computer Science, Information Systems, Data Science, Artificial Intelligence, Machine Learning, or related disciplines
- 10+ years of experience guiding the development and implementation of data architecture in structured, unstructured, and semi-structured data environments
- Highly proficient in Big Data, data architecture, data modeling, data warehousing, data wrangling, data integration, data testing and application performance tuning
- Experience with data engineering tools and platforms such as Kafka, Spark, Databricks, Flink, Storm, Druid and Hadoop
- Strong hands-on programming and scripting skills in the Big Data ecosystem (Python, Scala, Spark, etc.)
- Experience building batch and streaming ETL data pipelines using workflow management tools such as Airflow, Luigi, NiFi, and Talend
- Familiarity with cloud-based platforms like AWS, Azure, or GCP
- Experience with cloud data warehouses like Redshift and Snowflake
- Proficient in writing complex SQL queries
- Excellent communication skills and prior experience working closely with customers
- Data-savvy, with a passion for understanding large-scale data trends and a strong analytical mindset
- Desire to learn about, explore, and invent new tools for solving real-world problems using data
- Cloud computing experience, particularly with Amazon Web Services (AWS)
- Prior experience with data warehousing concepts and multi-dimensional data models
- Full command of analytics concepts, including dimensions, KPIs, reports, and dashboards
- Prior experience managing client implementations of analytics projects
- Knowledge of and prior experience with machine learning tools