Jobvertise
Data Enterprise Engineer/Architect (Dremio, Snowflake) Location: US-NY-New York City Jobcode: Ref
Experience level: Mid-senior
Experience required: 10 years
Education level: Bachelor's degree
Job function: Information Technology
Industry: Financial Services
Compensation: View salary
Total positions: 1
Relocation assistance: No
JOB DESCRIPTION:
We are consolidating our analytics and access to our data sources by leveraging a hybrid Lakehouse architecture and building out the surrounding toolkit required to meet user needs from across the enterprise.
You will join our central data team at MMC Tech to help modernize our enterprise approach to data, with a focus on enabling self-service capabilities.
You will integrate existing tooling and processes and move us to the next level by identifying additional needs and suitable technologies.
You will work closely with technical and business colleagues to design, build, and support required functionality in compliance with our enterprise requirements.
RESPONSIBILITIES:
Pushing us to adopt better practices
Building out a modern, controlled enterprise Lakehouse
Integrating supporting technologies and governance processes into the methods for accessing data
Integrating user-friendly interfaces with Lakehouse feeds
Enabling data self-service
Documenting architectures and processes for enterprise consumption
Developing reusable data assets to enable our teams to function more efficiently and at higher levels
QUALIFICATIONS:
Significant experience implementing advanced data and analytics solutions on a modern data platform (e.g., Dremio, Databricks, Snowflake)
Experience working with Kubernetes and dealing with networking policies and configurations
Experience working with and configuring enterprise data visualization tools (e.g. Qlik, Tableau, Power BI)
Ability to build rapport and maintain close working relationships with technical colleagues
5+ years’ experience with SQL
5+ years’ experience writing Scala, Python, or Java
Familiarity with Git and DevOps tooling
Familiarity with containerization and the surrounding ecosystem, including secrets management
Experience working with Agile methodologies
ADDITIONAL QUALIFICATIONS:
Experience configuring and working with Kafka
Experience working with data masking and anonymization techniques
Experience working across both modern and legacy data sources and structures
Experience working with DBT or other modern data development tools
Good understanding of front-end and back-end development
Exposure to Workflow and Decision automation (e.g., Camunda, Airflow)
1.) Please elaborate on the candidate's experience writing Scala, Python, or Java
2.) Please elaborate on the candidate's experience working with Kubernetes, including networking policies and configurations
3.) Please elaborate on the candidate's experience implementing advanced data and analytics solutions on a modern data platform (e.g., Dremio, Databricks, Snowflake)
4.) Please provide the link to candidate’s LinkedIn profile:
5.) What is the candidate’s work authorization status?
6.) What is the candidate’s highest level of education?
7.) Has the candidate applied for, or been interviewed for, any role with this company in the past? If so, please provide details.
8.) What is the candidate’s desired total compensation? (Please specify base salary vs. commission/bonus expectation)
9.) Where is the candidate located? If candidate is not near the job location, please explain relocation plan in detail (e.g. timeline, relocating with family, selling/buying property)
Jonathan Thompson
ESR
Confidential, San Diego, CA 92126