ETL Fundamentals, SQL (Basic + Advanced), BigQuery, Dataprep, Python, Data Warehousing, Data Studio, Composer, Dataflow, Dataplex, Modern Data Platform Fundamentals, Data Modelling Fundamentals, PL/SQL, T-SQL, Stored Procedures
Specialization
GCP Data Architecture: Data Engineer
Job requirements
Job description below.
• 4+ years of experience in the data engineering field is preferred
• 3+ years of hands-on experience with the GCP cloud data suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage
• Strong experience with and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms
• Strong hands-on experience in the following technologies:
  1. Google BigQuery (GBQ)
  2. Python
  3. Apache Airflow
  4. SQL (BigQuery dialect preferred)
• Extensive hands-on experience working with data using SQL and Python
• Cloud Functions; comparable skills in AWS or another cloud's big-data engineering stack are also considered
• Experience with agile development methodologies
• Excellent verbal and written communication skills, with the ability to clearly present ideas, concepts, and solutions
• Bachelor's degree in Computer Science, Information Technology, or a closely related discipline