Hire Python Developers within a week
Hire Top Remote Software Dev Wizards!
Exp : 10+ Years
$35 / hr
Nirmal K
Python Developer
7+ years of experience in infrastructure administration spanning AWS Cloud, IBM Cloud, Azure Cloud, vSphere administration, server administration, and network administration across Windows and Linux environments.
Key Skills
- AWS
- Shell
- Python
- Nginx
- Apache
- Tomcat
- Jenkins
- Azure
- DevOps
- Kafka
Nirmal K
Cloud Engineer
Exp : 10+ Years
$45 / hr
Cloud Engineer with over 7 years of experience in infrastructure administration, specializing in AWS, IBM Cloud, and Azure environments. Proficient in server, network, and virtualization management with a focus on automation, DevOps practices, and cloud-based application monitoring. Experienced in scripting, infrastructure-as-code, and troubleshooting across both Windows and Linux systems.
Educational Qualification:
- M.Sc. in Cyber Forensic and Information Security – Madras University, Chennai (2019-2020)
- B.Sc. in Computer Science – A.M. Jain College, Chennai (2010-2013)
Technical Skills
Cloud Platforms: AWS, IBM Cloud, Azure
Services: Nginx, Apache, Tomcat, Kafka, NPM/PM2
Databases: MongoDB, MySQL, Redis
Scripting: Shell, Python
DevOps & CI/CD: Jenkins, Azure DevOps, Ansible, ManageEngine
Virtualization: VMware, ESXi, vCenter, Docker
Networking: MPLS, Firewall (Fortinet, Vyatta), Switches, BGP, RIP, OSPF, IPsec
Infrastructure as Code: Terraform
Monitoring & Automation Tools: Grafana, New Relic, Nagios, n8n, cron, SIEM
Access Management: Arcos PIM/PAM
Backup & Sync: Veeam, Azure Site Recovery (ASR)
Expertise
Cloud Infrastructure & Application Monitoring: Experience in managing AWS, IBM Cloud, and Azure cloud infrastructures, ensuring high availability and load balancing for applications. Proficient in cloud monitoring and troubleshooting using tools such as Grafana, New Relic, and Nagios.
Server & Network Management: Expertise in managing virtual machines, web servers, firewalls, and network configurations in both production and development environments, using technologies such as Apache, Tomcat, and Nginx.
Automation & Scripting: Skilled in automating tasks with Shell and Python scripting as well as tools like Jenkins and n8n. Experienced in creating CI/CD pipelines for deployment automation.
Security & Compliance: Configured SIEM monitoring tools and implemented security practices including 2FA, OS hardening, and regular vulnerability patching. Experienced with PCI DSS and CERT-In compliance.
Backup & Disaster Recovery: Implemented incremental and differential backups, set up disaster recovery drills, and ensured data sync between data centers using tools like Veeam and Azure ASR.
Work Experience:
Cloud Engineer (Feb 2022 – Present):
- Managed AWS cloud infrastructure and application monitoring, configured load balancers and availability groups, and automated alert posting to relevant teams.
- Worked with automation tools (n8n, cron) and scripting languages (Python, Shell) to create custom scripts for system management, as sketched below.
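The alert automation above could look roughly like the following Python sketch, which polls CloudWatch for alarms in the ALARM state and forwards them to a team webhook. The webhook URL and alarm-name prefix are hypothetical placeholders, and AWS credentials are assumed to be configured for boto3; this illustrates the pattern rather than the exact production script.

```python
# Minimal sketch of an alert-forwarding script: poll CloudWatch alarms and
# post them to a team channel. Webhook URL and alarm prefix are hypothetical.
import json
import urllib.request

import boto3

WEBHOOK_URL = "https://example.com/team-alerts"  # hypothetical endpoint


def post_alarms_in_state(state="ALARM", prefix=""):
    cloudwatch = boto3.client("cloudwatch")
    paginator = cloudwatch.get_paginator("describe_alarms")
    kwargs = {"StateValue": state}
    if prefix:
        kwargs["AlarmNamePrefix"] = prefix
    for page in paginator.paginate(**kwargs):
        for alarm in page["MetricAlarms"]:
            payload = {
                "alarm": alarm["AlarmName"],
                "reason": alarm.get("StateReason", ""),
                "since": alarm["StateUpdatedTimestamp"].isoformat(),
            }
            req = urllib.request.Request(
                WEBHOOK_URL,
                data=json.dumps(payload).encode(),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)  # forward the alert to the team


if __name__ == "__main__":
    post_alarms_in_state(prefix="prod-")  # hypothetical alarm-name prefix
```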
Lead System Administrator (Aug 2018 – Feb 2022):
- Managed IBM Cloud VMs, networks, and security groups, and deployed web applications with Apache, Tomcat, and Kafka.
- Implemented CI/CD pipelines using Azure, Git, Jenkins, and Ansible, ensuring seamless deployment in production and UAT environments.
System Support (Mar 2017 – Aug 2018):
- Provided system administration for supply chain projects, handling Active Directory, desktop administration, and ERP application support.
System Executive (Feb 2014 – Feb 2016):
- Installed, configured, and administered operating systems (Windows, RHEL, CentOS, Ubuntu) and performed user administration and server maintenance.
Exp : 5+ Years
$30 / hr
Manikanta K
Python Developer
Data Engineer with 5+ years of experience in BI development using Big Data and cloud services
Key Skills
- Python
- Big Data
- MS-SQL Server
- Azure SQL
- TFS
- VSTS
- Azure Data Lake
Manikanta K
Data Engineer
Exp : 5.5 Years
$30 / hr
Key Skills
- Python
- Big Data
Additional Skills
- MS-SQL Server
- Azure SQL
- TFS
- VSTS
- Azure Data Lake
- Data Factory
- SSIS
Detailed Experience
- Extensive experience working on Azure cloud and delivering solutions involving services such as Data Lake, VMs, ADF, Azure Functions, and Databricks (see the PySpark sketch after this list)
- 2 years of experience working on AWS cloud and delivering solutions involving services such as S3, EC2, Glue, Lambda, and Athena
- Capable of writing complex SQL queries and tuning their performance
- Design and development of big data applications in Apache Spark and Azure
- Experience in utilizing MSSQL, Azure SQL, and Redshift.
- Excellent verbal and written communication skills; a proven team player.
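As an illustration of the Azure Data Lake and Databricks work above, here is a minimal PySpark sketch that reads raw parquet from a lake container, applies a simple aggregation, and saves a curated table. The storage path, column names, and table name are hypothetical; the actual pipelines would differ.

```python
# Minimal PySpark sketch: read raw files from Azure Data Lake, transform,
# and persist a curated table. Paths, columns, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-curation").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"  # hypothetical

raw = spark.read.format("parquet").load(raw_path)

curated = (
    raw.dropDuplicates(["order_id"])                     # basic data quality step
       .withColumn("order_date", F.to_date("order_ts"))  # derive a partition column
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("daily_amount"))
)

# In Databricks this would typically land as a managed table for BI consumption.
curated.write.mode("overwrite").saveAsTable("analytics.daily_sales")
```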
Exp : 5 Years
$30 / hr
Shashank
Python Developer
Data Engineer with 5 years of experience in Python, Big Data, and cloud services
Key Skills
- Python
- SQL
- AWS
- Big Data
- Oracle
- MySQL
- SQL Server
- PostgreSQL
Shashank
Data Engineer
Exp : 5 Years
$30 / hr
Key Skills
- Python
- SQL
- AWS
- Big Data
Additional Skills
- Oracle
- MySQL
- SQL Server
- Postgres
- Apache Spark
- PySpark
- DMS
- RDS
- Glue
- Lambda
- DynamoDB
- CloudWatch
Detailed Experience
- Proficient with AWS cloud services, developing cost-effective, accurate data pipelines and optimizing them.
- Capable of handling multiple data sources like DynamoDB, RDS, JSON, text, CSV.
- Developed PySpark scripts in Databricks to transform data and load it into data tables.
- Good experience creating pipelines for loan audits and risk analysis for RBI compliance.
- Automated the generation of PMS reports using PySpark.
- Involved in data migration activities and post-migration data validation.
- Expert in developing PySpark scripts to transform data to new data models.
- Created a data pipeline for a client to price their products, and an ETL pipeline to compare their product pricing with that of direct competitors (a rough sketch follows this list).
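A rough PySpark sketch of the pricing-comparison ETL described in the last bullet, assuming hypothetical table names, columns, and a SKU join key; it shows the shape of the pipeline, not the client's actual implementation.

```python
# Rough sketch of a pricing-comparison ETL: join the client's catalog against
# competitor prices and compute a delta. Tables, columns, and the join key are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pricing-comparison").getOrCreate()

own = spark.table("pricing.catalog")            # client product prices
competitor = spark.table("pricing.competitor")  # ingested competitor prices

comparison = (
    own.join(competitor, on="sku", how="left")
       .withColumn("price_delta", F.col("own_price") - F.col("competitor_price"))
       .withColumn("is_cheaper", F.col("price_delta") < 0)
)

# Persist the daily comparison for downstream reporting.
comparison.write.mode("overwrite").saveAsTable("pricing.comparison_daily")
```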
Exp : 4+ Years
$25 / hr
Vivekanand C
Python Developer
Data Engineer with 4+ years of experience in ETL development and crafting robust Data Warehouse solutions.
Key Skills
- AWS Services
- Python
- SQL
- Big Data
- Airflow
- GitHub
- JIRA
- Oracle SQL
Vivekanand C
Data Engineer
Exp : 4+ Years
$25 / hr
Key Skills
- AWS services
- Python
- SQL
- Big Data
Additional Skills
- Airflow
- GitHub
- JIRA
- Oracle SQL
- Jupyter
- VS Code
Detailed Experience
- Capable of leveraging a suite of technologies, including Python, SQL, PySpark, and AWS services like EMR, Glue, Redshift, Athena, EC2, and S3, to transform raw data into actionable insights.
- Development and implementation of ETL solutions using Python, PySpark, SQL, and AWS services, particularly AWS Glue and AWS EMR.
- Proficient in orchestrating ETL data pipelines using Apache Airflow, integrating S3 as a data lake, Glue for data transformation, and Redshift for data warehousing to create end-to-end ETL pipelines (see the DAG sketch after this list).
- Testing and data validation using Athena to ensure data accuracy and reliability after transformation.
- Successful implementation of robust Data Warehousing solutions with Redshift to streamline downstream data consumption.
- Building Data Pipelines, Data Lakes, and Data Warehouses while demonstrating strong knowledge of normalization, Slowly Changing Dimension (SCD) handling, Fact and Dimension tables.
- Extensive familiarity with a range of AWS services, including EMR, Glue, Redshift, S3, Athena, Lambda, EC2, and IAM, facilitating comprehensive data engineering solutions.
- Expertise in Oracle Database, adept at crafting complex SQL queries for data retrieval and manipulation.
- Sound understanding of SQL concepts such as views, subqueries, joins, string, window, and date functions.
- Proficient in PySpark concepts, including advanced joins, Spark architecture, performance optimization, RDDs, and Dataframes.
- Skilled in performance tuning and optimization of Spark jobs, utilizing tools like Spark Web UI, Spark History Server, and Cluster logs.
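The Airflow orchestration described above could be wired up roughly as in the sketch below: a DAG with a Glue transformation task followed by a Redshift COPY, implemented here with plain boto3 calls inside PythonOperator tasks rather than provider-specific operators. The job name, cluster identifier, S3 path, and IAM role are hypothetical placeholders.

```python
# Sketch of an Airflow DAG orchestrating S3 -> Glue -> Redshift, using boto3
# inside PythonOperator tasks. All resource names below are hypothetical.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_glue_transform(**_):
    glue = boto3.client("glue")
    # Starts a pre-created Glue job that curates raw S3 data; in practice you
    # would poll get_job_run until the run completes.
    glue.start_job_run(JobName="curate-orders")  # hypothetical job name


def load_into_redshift(**_):
    rs = boto3.client("redshift-data")
    # COPY the curated parquet from the data lake into the warehouse table.
    rs.execute_statement(
        ClusterIdentifier="analytics-cluster",  # hypothetical cluster
        Database="dw",
        DbUser="etl_user",
        Sql="COPY dw.orders FROM 's3://example-lake/curated/orders/' "
            "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy' FORMAT AS PARQUET;",
    )


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(task_id="glue_transform", python_callable=run_glue_transform)
    load = PythonOperator(task_id="redshift_load", python_callable=load_into_redshift)
    transform >> load
```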
Exp : 4 Years
$25 / hr
Ashok A R
Python Developer
Data Engineer with 2+ years of specialization in ETL, data warehousing, and cross-functional collaboration.
Key Skills
- Python
- Data Science
- AWS Services
- ETL
- HDFS
- PySpark
- Hive
- Pandas
Ashok A R
Data Engineer
Exp : 4 Years
$25 / hr
Key Skills
- Python
- Data Science
- AWS
Additional Skills
- ETL
- HDFS
- PySpark
- Hive
- Pandas
- Data Warehousing
- Kafka
- MySQL
- MongoDB
- NumPy
- Seaborn
- TensorFlow
- Scikit-Learn
- Tableau
- EC2
- S3
- RDS
- Glue
- Athena
- EMR
- Redshift
- Lambda
- Kinesis
- DynamoDB
- Boto3
- Docker
- Jenkins
- GitHub
- Git
- Airflow
- SQL
- NoSQL
- C++
Detailed Experience
- Design and implementation of robust Python frameworks utilizing PySpark and Boto3 in Databricks Notebooks. These frameworks were used for data processing, unit testing, and interaction with cloud services, contributing to enhanced efficiency and data quality (a minimal sketch follows this list).
- Usage of Amazon EMR to process and analyze large-scale datasets, applying advanced Spark transformations for feature engineering and data enrichment in machine learning models.
- Implementation of end-to-end data encryption using AWS Key Management Service (KMS) and SSL/TLS protocols, ensuring data security and compliance with industry standards.
- Migration of legacy data pipelines to AWS Glue, achieving a 60% reduction in maintenance effort and leading to improved pipeline stability and reduced downtime.
- Collaboration with data scientists to deploy machine learning models on AWS SageMaker, enabling real-time predictions and recommendations for customer behavior.
- Authoring comprehensive technical documentation and knowledge base articles, facilitating efficient onboarding of new team members and promoting best practices.
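A minimal sketch of the kind of framework mentioned in the first bullet: a pure PySpark transformation that can be unit-tested with small local DataFrames, plus a thin boto3 helper for the S3 interaction. The bucket, prefix, and column names are hypothetical.

```python
# Sketch of a testable PySpark + boto3 framework layout used from a notebook.
# Bucket, prefix, and column names are hypothetical placeholders.
import boto3
from pyspark.sql import DataFrame, functions as F


def enrich_orders(orders: DataFrame, customers: DataFrame) -> DataFrame:
    """Pure transformation: easy to exercise in a unit test with tiny DataFrames
    built via spark.createDataFrame."""
    return (
        orders.join(customers, on="customer_id", how="left")
              .withColumn("order_month", F.date_trunc("month", F.col("order_ts")))
    )


def list_input_files(bucket: str, prefix: str) -> list[str]:
    """Thin boto3 helper the notebook uses to discover new raw files in S3."""
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return [obj["Key"] for obj in resp.get("Contents", [])]
```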
Exp : 4 Years
$25 / hr
Shreya R
Python Developer
Data Engineer with 4 years of experience in building data-intensive applications, tackling architectural and scalability challenges.
Key Skills
- AWS
- Python
- PySpark
- Django
- Flask
- MySQL
- PostgreSQL
- MongoDB
Shreya R
Data Engineer
Exp : 4 Years
$25 / hr
Key Skills
- AWS
- Python
Additional Skills
- PySpark
- Django
- Flask
- MySQL
- PostgreSQL
- MongoDB
- GitHub
- Jira
- Docker
- Jenkins
Detailed Experience
- Expertise in developing data pipelines using AWS services such as EC2, ECS, Glue, Airflow and Lambda for efficient data processing and management.
- Proficient in working with AWS S3 for data storage and retrieval, integrating it with Spark and PySpark to enable powerful data processing capabilities.
- Developed ETL workflows using PySpark and Glue to transform, validate, and load large volumes of data from diverse sources into AWS data lakes (see the Glue job sketch after this list).
- Experienced in designing and implementing scalable data architectures in AWS, including data modeling and database design utilizing Redshift and RDS technologies.
- Analyzed SQL scripts and optimized performance using PySpark SQL.
- Ability to work independently with minimal supervision in a team environment, with strong problem-solving and interpersonal skills.
- Prior experience as a web developer, using Python, Django, and Flask frameworks for web development projects, along with Git for version control and collaborative development.
- Skilled in data processing and analysis using Python libraries such as pandas.
- Experienced in working with relational databases and writing complex SQL queries for data extraction and manipulation.
- Familiarity with serverless computing using AWS Lambda, enabling cost-effective and scalable execution of data processing tasks.
- Excellent communication skills, collaborating effectively with cross-functional teams and stakeholders to drive project success.
- Proficient in using Git for version control and collaborative development.
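The PySpark/Glue ETL workflows above typically follow the standard Glue job skeleton sketched below: read from S3, validate, and write curated parquet back to the data lake. The job arguments, validation rule, paths, and column names are hypothetical.

```python
# Skeleton of an AWS Glue ETL job: read raw JSON from S3, validate, and write
# curated parquet back to the lake. Arguments and columns are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw JSON, drop records failing a basic validation rule, write parquet.
raw = spark.read.json(args["source_path"])
valid = raw.filter(F.col("event_id").isNotNull())  # hypothetical validation rule
valid.write.mode("append").partitionBy("event_date").parquet(args["target_path"])

job.commit()
```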
Exp : 4 Years
$25 / hr
Rohit M
Python Developer
Data Engineer with 3+ years of relevant experience on Big Data platforms and AWS services.
Key Skills
- AWS Services
- Python
- PySpark
- Flask
- Django
- REST APIs
- MySQL
- MongoDB
Rohit M
Data Engineer
Exp : 4 Years
$25 / hr
Key Skills
- Python
- PySpark
- AWS
Additional Skills
- Flask
- Django
- REST APIs
- MySQL
- MongoDB
- PostgreSQL
- GIT
- Docker
- Bamboo
- Bitbucket
- Spark Streaming
Detailed Experience
- Experience in building data pipelines using AWS services such as EC2, ECS, Glue, and Lambda.
- Involved in writing Spark SQL scripts for data processing as per business requirements.
- Applied exception handling and performance-optimization techniques in Python scripts using Spark DataFrames (a brief sketch follows this list).
- Expertise in developing business logic in Python and PySpark.
- Good experience in writing queries in SQL.
- Proficient in working with data storage and retrieval using AWS S3 and integrating it with Spark and PySpark for efficient data processing.
- Development of ETL workflows using PySpark and Glue to transform, validate, and load large amounts of data from various sources to the AWS data lake.
- Expertise in designing and implementing scalable data architectures in AWS, including data modeling and database design using technologies like Redshift and RDS.
- Strong experience with tools such as Git, Docker, and JIRA.
- Proficient in programming using IDEs such as Eclipse, PyCharm, and VS Code.
- Hands-on experience with Spark Streaming.
- Usage of Databricks for a variety of big data use cases, such as data preparation, ETL, data exploration and visualization, machine learning, and real-time analytics.
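A brief sketch of the Spark SQL and exception-handling pattern referenced above, assuming hypothetical S3 paths and a simple aggregation query; AnalysisException is caught so the task fails with a clear message rather than an opaque stack trace.

```python
# Sketch: run a Spark SQL aggregation over a curated dataset with basic
# exception handling. Paths, view name, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.utils import AnalysisException

spark = SparkSession.builder.appName("orders-report").getOrCreate()

spark.read.parquet("s3a://example-bucket/curated/orders/") \
     .createOrReplaceTempView("orders")

try:
    report = spark.sql(
        """
        SELECT region, SUM(amount) AS total_amount
        FROM orders
        GROUP BY region
        """
    )
    report.write.mode("overwrite").parquet(
        "s3a://example-bucket/reports/orders_by_region/"
    )
except AnalysisException as exc:
    # Raised for missing tables/columns; log and fail the job cleanly.
    print(f"Spark SQL analysis failed: {exc}")
    raise
```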
Exp : 2+ Years
$25 / hr
Abhishek M
Python Developer
Key Skills
- AWS Services
- Python
- Google Cloud Platform
- Azure
- Linux
- Docker
- Kubernetes
Abhishek M
Cloud Engineer
Exp : 2+ Years
$25 / hr
Key Skills
- AWS
- Google Cloud Platform
- Azure
- Linux
- Python
- Docker
- Kubernetes
- Jenkins
Additional Skills
- Terraform
- ECS
- AWS CLI
- GCP CLI
- GKE
- CI/CD
- EKS
- Lambda
- Django
Detailed Experience
- Collaborated with frontend and backend developer teams to deploy three web applications on Google Kubernetes Engine (GKE), including the creation of Docker images for VS Code.
- Established and managed CI/CD pipelines using Cloud Build, ensuring automated deployment processes and overseeing pipeline health.
- Initiated, managed, and maintained applications in GKE, along with databases, cloud storage, access management, and credentials (a short sketch follows this list).
- Led data migration and server migration from AWS to GCP, involving Python/Django coding responsibilities for both platforms.
- Designed cloud architecture for web applications in Google Cloud Platform (GCP), considering budget constraints and implementing cost optimization strategies.
- Engaged in collaborative discussions with senior management, providing insights into cloud architecture workflows within specified timelines.
- Demonstrated hands-on experience with Elastic Beanstalk, S3, EKS, ECS, RDS, EC2, Lambda, CloudFront, Auto Scaling, DNS, AWS WAF, KMS, IAM, Global Accelerator, and Data Migrations, including server migration.
- Developed end-to-end CI/CD pipelines for projects in GCP using Cloud Build, encompassing code retrieval, application compilation, testing, and artifact pushing to the project.
- Acquired skills in AWS, CI/CD, ECS, EKS, AWS CLI, Lambda, Linux, Python, Django, GCP, cost optimization, Agile software development, GKE, and Docker.
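As a small illustration of the GKE application management described above, the sketch below uses the official kubernetes Python client to report deployment rollout status. It assumes the local kubeconfig already points at the GKE cluster (for example via gcloud container clusters get-credentials); the namespace is a hypothetical placeholder.

```python
# Sketch: report rollout status of deployments in a GKE cluster using the
# official kubernetes Python client. The namespace is a hypothetical placeholder.
from kubernetes import client, config


def report_rollouts(namespace: str = "web-apps") -> None:
    config.load_kube_config()  # reuse the kubeconfig created by gcloud
    apps = client.AppsV1Api()
    for dep in apps.list_namespaced_deployment(namespace).items:
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        state = "OK" if ready == desired else "DEGRADED"
        print(f"{dep.metadata.name}: {ready}/{desired} replicas ready [{state}]")


if __name__ == "__main__":
    report_rollouts()
```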