Pinned
Submit Databricks Job Using REST API in Airflow
Experience using the REST API to submit a job to Databricks via /api/2.1/jobs/runs/submit from Airflow. Nov 23, 2023
Pinned · Published in CT Engineering
Submit Databricks Job Using REST API: Launched by the Runs Submit API
# Create and trigger a one-time run. Nov 21, 2023
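These two posts share the same pattern: POST a one-time run to the Jobs 2.1 Runs Submit endpoint. A minimal sketch of that call follows; the workspace URL, token, cluster spec, and notebook path are placeholders, not values from the posts.

```python
import requests

# Placeholder workspace URL and personal access token.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# One-time run payload for the Jobs 2.1 Runs Submit API.
payload = {
    "run_name": "one-time-run",
    "tasks": [
        {
            "task_key": "main",
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            "notebook_task": {"notebook_path": "/Shared/example_notebook"},
        }
    ],
}

response = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/runs/submit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
response.raise_for_status()
print(response.json())  # the response carries the run_id of the submitted run
```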
Submitting a Databricks Job Using DatabricksSubmitRunOperator with a JSON Payload and Named Parameters
# Create and trigger a one-time run using operators. Dec 1, 2023
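As a companion to the operator post, here is a hedged sketch of both invocation styles the title mentions: passing the whole runs/submit payload through the operator's json argument, and passing the same fields as named operator parameters. The DAG id, connection id, cluster spec, and notebook path are illustrative.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

new_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
}
notebook_task = {"notebook_path": "/Shared/example_notebook"}

with DAG(
    dag_id="databricks_submit_run_example",
    start_date=datetime(2023, 12, 1),
    schedule=None,  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    # Variant 1: hand the full runs/submit payload to the operator as a dict.
    submit_with_json = DatabricksSubmitRunOperator(
        task_id="submit_with_json",
        databricks_conn_id="databricks_default",
        json={
            "run_name": "one-time-run-json",
            "new_cluster": new_cluster,
            "notebook_task": notebook_task,
        },
    )

    # Variant 2: pass the same fields as named operator parameters.
    submit_with_params = DatabricksSubmitRunOperator(
        task_id="submit_with_params",
        databricks_conn_id="databricks_default",
        new_cluster=new_cluster,
        notebook_task=notebook_task,
    )

    submit_with_json >> submit_with_params
```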
Customer 360
The following data pipeline explains the steps to import data from MySQL to HDFS using Sqoop, load data into Spark from HDFS, and store… Nov 27, 2021
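For the Spark leg of this pipeline, here is a minimal PySpark sketch that reads the files Sqoop landed in HDFS and writes a derived dataset back. The HDFS paths and column names are assumptions for illustration, not taken from the post.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer360").getOrCreate()

# Assumed HDFS path: Sqoop's default import directory for a `customers` table.
customers = (
    spark.read
    .option("inferSchema", "true")
    .csv("hdfs:///user/hadoop/customers")
    .toDF("customer_id", "name", "email", "city")  # column names are illustrative
)

# Example transformation before storing the result back to HDFS.
by_city = customers.groupBy("city").count()
by_city.write.mode("overwrite").parquet("hdfs:///user/hadoop/customer360/by_city")
```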
Apache Sqoop
Apache Sqoop is a tool used to ingest data from a relational database into HDFS storage. It is mainly a part of Hadoop because it… Nov 14, 2021
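The ingest step itself is typically a single sqoop import; the sketch below wraps that command in Python so it stays in the same language as the other examples. The JDBC URL, credentials, table, and target directory are placeholders.

```python
import subprocess

# Illustrative Sqoop import: MySQL table `customers` into HDFS as text files.
sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://localhost:3306/retail_db",
    "--username", "retail_user",
    "--password", "retail_pass",
    "--table", "customers",
    "--target-dir", "/user/hadoop/customers",
    "--num-mappers", "1",
]
subprocess.run(sqoop_cmd, check=True)
```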
ETL Data Pipeline in AWS
ETL (Extract, Transform, and Load) is an emerging topic across the IT industry. Companies are often looking for an easy solution and… Nov 24, 2020
Elastic Load Balancing & Auto Scaling Groups Section
This is where we see the power of the cloud, that is, the AWS cloud. Oct 13, 2020
Elastic Load Balancing
What is load balancing? Load balancers are servers that forward internet traffic to multiple servers (EC2 instances) downstream. Oct 13, 2020
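To make "forward traffic to multiple servers downstream" concrete, here is a hedged boto3 sketch that wires two EC2 instances behind an Application Load Balancer. All ids (subnets, security group, VPC, instances) are placeholders, and the post itself does not prescribe this setup.

```python
import boto3

elbv2 = boto3.client("elbv2", region_name="us-east-1")

# Placeholder network and instance ids.
SUBNETS = ["subnet-aaa111", "subnet-bbb222"]
SECURITY_GROUPS = ["sg-0123456789abcdef0"]
VPC_ID = "vpc-0123456789abcdef0"
INSTANCE_IDS = ["i-0aaa111bbb222ccc3", "i-0ddd444eee555fff6"]

# The load balancer that receives internet traffic.
lb = elbv2.create_load_balancer(
    Name="demo-alb",
    Subnets=SUBNETS,
    SecurityGroups=SECURITY_GROUPS,
    Scheme="internet-facing",
    Type="application",
)["LoadBalancers"][0]

# Target group: the pool of downstream EC2 instances.
tg = elbv2.create_target_group(
    Name="demo-targets",
    Protocol="HTTP",
    Port=80,
    VpcId=VPC_ID,
    TargetType="instance",
)["TargetGroups"][0]

elbv2.register_targets(
    TargetGroupArn=tg["TargetGroupArn"],
    Targets=[{"Id": i} for i in INSTANCE_IDS],
)

# Listener: forward incoming HTTP traffic to the registered targets.
elbv2.create_listener(
    LoadBalancerArn=lb["LoadBalancerArn"],
    Protocol="HTTP",
    Port=80,
    DefaultActions=[{"Type": "forward", "TargetGroupArn": tg["TargetGroupArn"]}],
)
```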