Chetanya-Patil: Submit Databricks Job Using REST API in Airflow. Using the REST API to submit a job to Databricks from Airflow via /api/2.1/jobs/runs/submit. 3 min read · Nov 23, 2023
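The entry above refers to the Databricks Jobs API endpoint POST /api/2.1/jobs/runs/submit, which creates and triggers a one-time run. A minimal sketch of building and sending such a request follows; the workspace host, token, notebook path, and cluster settings are illustrative placeholders, not values from the articles.

```python
import json
import urllib.request

# Placeholder workspace host and personal access token; replace with real values.
HOST = "https://example.cloud.databricks.com"
TOKEN = "dapi-REDACTED"


def build_submit_payload(run_name, notebook_path):
    """Build the JSON body for POST /api/2.1/jobs/runs/submit (one-time run)."""
    return {
        "run_name": run_name,
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                # Hypothetical cluster spec for illustration only.
                "new_cluster": {
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": "i3.xlarge",
                    "num_workers": 1,
                },
            }
        ],
    }


def submit_run(payload):
    """POST the payload to runs/submit and return the run_id of the new run."""
    req = urllib.request.Request(
        f"{HOST}/api/2.1/jobs/runs/submit",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["run_id"]
```

From Airflow, the same call could be made inside a PythonOperator task, though the Databricks provider's operators (covered in the entries below) are the more idiomatic route.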
Chetanya-Patil in CT Engineering: Submit Databricks Job Using REST API, Launched by the Runs Submit API. Create and trigger a one-time run. 3 min read · Nov 21, 2023
Chetanya-Patil: Submitting a Databricks Job Using DatabricksSubmitRunOperator with a JSON Payload and Named Parameters. Create and trigger a one-time run using operators. 2 min read · Dec 1, 2023
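The entry above contrasts the two ways DatabricksSubmitRunOperator can be configured: a single `json` payload mirroring the runs/submit request body, or equivalent named parameters such as `new_cluster` and `notebook_task`. A hedged sketch of both styles, assuming the apache-airflow-providers-databricks package; the notebook path and cluster values are placeholders:

```python
# Illustrative cluster spec and notebook task; values are placeholders.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 1,
}
notebook_task = {"notebook_path": "/Repos/demo/etl_notebook"}

# Style 1: one `json` payload shaped like the runs/submit request body.
json_payload = {
    "new_cluster": new_cluster,
    "notebook_task": notebook_task,
}


def make_operators():
    """Build both operator variants (requires apache-airflow-providers-databricks)."""
    from airflow.providers.databricks.operators.databricks import (
        DatabricksSubmitRunOperator,
    )

    via_json = DatabricksSubmitRunOperator(
        task_id="submit_via_json",
        json=json_payload,
    )
    # Style 2: named parameters; the operator merges them into the same payload.
    via_named = DatabricksSubmitRunOperator(
        task_id="submit_via_named",
        new_cluster=new_cluster,
        notebook_task=notebook_task,
    )
    return via_json, via_named
```

Both variants produce the same runs/submit request; the `json` style is convenient when the payload is built dynamically, while named parameters read more clearly for static DAG definitions.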
Chetanya-Patil: Customer 360. The following data pipeline explains the steps to import data from MySQL to HDFS using Sqoop, load the data into Spark from HDFS, and store… 3 min read · Nov 27, 2021
Chetanya-Patil: Apache Sqoop. Apache Sqoop is a tool used to ingest data from a relational database into HDFS storage. It is mainly part of Hadoop because it… 2 min read · Nov 14, 2021
Chetanya-Patil: ETL Data Pipeline in AWS. ETL (Extract, Transform, and Load) is an emerging topic across the IT industry. Industries often look for an easy solution and… 7 min read · Nov 24, 2020
Chetanya-Patil: Elastic Load Balancing & Auto Scaling Groups Section. This is where we see the power of the cloud, namely the AWS cloud. 4 min read · Oct 13, 2020
Chetanya-Patil: Elastic Load Balancing. What is load balancing? Load balancers are servers that forward internet traffic to multiple servers (EC2 instances) downstream. 2 min read · Oct 13, 2020