Jobvertise

Big Data Architect // Bellevue, WA (Onsite)
Location: US-TX-Richardson
Jobcode: 3575612
Apply Online
or email this job to apply later

Job Title: Big Data Architect
Location: Bellevue, WA (Onsite)

Job Description:

Must Haves:
- Data engineering pipeline experience
- Data modelling experience
- Ongoing hands-on development and systems architecture around ETL on the cloud, with a good understanding of best practices for the interworking of SQL and NoSQL environments to meet varied business needs
- Design, build, and maintain Big Data workflows/pipelines to process continuous streams of data, with experience in the end-to-end design and build of near-real-time and batch data pipelines
- Demonstrated work experience with Big Data and distributed programming models and technologies
- Knowledge of database structures, theories, principles, and practices (both SQL and NoSQL)
- Active development of ETL processes using Spark or other highly parallel technologies, and implementation of ETL/data pipelines
- Experience with data technologies and Big Data tools such as Spark, Kafka, Hive, Sqoop, or similar
- Experience managing and ingesting data from upstream sources into a Snowflake data warehouse
- Meaningful experience with NoSQL and Big Data technologies such as Cassandra, Solr, Kafka, and Hadoop
- Strong experience with container orchestration frameworks (e.g., Elastic Kubernetes Service) preferred

Technical qualifications and experience level:
1. At least 10 years of combined proven working experience in Spark/Big Data
2. 5-10 years of development using Java, Python, Scala, and object-oriented approaches in designing, coding, testing, and debugging programs
3. 2-3 years architecting data lakes or other unstructured, distributed data warehouse environments
4. Ability to create simple scripts and tools
5. Development of cloud-based, distributed applications
6. Working knowledge of COTS ETL tools and custom-developed ETL tools
7. Understanding of clustering and cloud orchestration tools
8. Working knowledge of database standards and end-user applications
9. Working knowledge of data backup, recovery, security, integrity, and SQL
10. Familiarity with database design, documentation, and coding
11. Previous experience with DBA CASE tools (frontend/backend) and third-party tools
12. Understanding of distributed file systems and their optimal use in the commercial cloud (HDFS, S3, Google File System, DataStax, Delta Lake)
13. Familiarity with programming language APIs
14. Problem-solving skills and the ability to think algorithmically
15. Working knowledge of RDBMS/ORDBMS such as MariaDB, Oracle, and PostgreSQL
16. Working knowledge of Hadoop administration
17. Knowledge of SDLC methodologies (Waterfall, Agile, and Scrum)
18. BS degree in a computer discipline or relevant certification

tanishasystems

