Gathi Analytics at GeoSmart India 2019

Gathi Analytics presented at GeoSmart India 2019 on the topic "Smart Services for Smart Cities." Vinit Reddy, Director of Innovation, presented the Smart City Platform (SEP), explaining the ease with which cities can adopt and benefit from the platform.

#geosmartindia2019 #smartcities


Gathi Analytics featured in CIO Review EdTech Special Edition.

“Our ‘Unique 8,8,8 Engagement Model’ is enabling businesses to stay ahead of the curve. The Modern Data Platform helps take an organized approach to the enormous data distributed over disparate systems.”

Call Us Today! +1 (614) 345-8646

Data Engineer (Python/PySpark)

Location: San Antonio, TX
Experience: 8 years

Gathi Analytics is a relentlessly client-focused group that builds ground-breaking data solutions for Smart Cities, Payment Services, Healthcare, Finance, and more. Gathi Analytics leverages proprietary frameworks and methodologies to deliver solutions faster, without compromising quality. We help our clients gain data insights they never thought possible and understand their data assets in ways they never could before. If this sounds like you, please read on.


About the Data Engineer role

We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.



  • Communicate progress across organizations and levels from individual contributor to senior executive. Identify and clarify the critical issues that need action and drive appropriate decisions and actions. Communicate results clearly and in actionable form.
  • Lead development and ongoing maintenance and enhancement of applications running on Azure Cloud and business intelligence tools.
  • Produce detailed technical designs, conduct analysis, and develop applications and proofs of concept
  • Develop microservices, application code, and configuration to deliver applications
  • Provide technical leadership for the development and BI teams to deliver on various initiatives
  • Lead problem-resolution tasks and document approaches for support mechanisms
  • Ensure all solutions meet Enterprise Guidelines and industry standards/best practices
  • Advise IT and business stakeholders of alternative solutions
  • Ensure optimal system performance across BI & Analytics platforms.
  • Lead the effort to monitor system activity, tune performance and architect solutions to meet future demand.
  • Offer technical guidance to team members and lead design/requirements sessions
  • Benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
  • Articulate the pros and cons of various technologies and platforms; document use cases, solutions, and recommendations
  • Troubleshoot complex system issues and handle multiple tasks simultaneously


  • Bachelor’s or Master’s degree in Computer Science, Mathematics, or Statistics
  • 4+ years of development experience building Spark applications with Python and PySpark
  • 3+ years’ hands-on experience developing optimized, complex SQL queries and writing PL/SQL code across large volumes of data in both relational and multi-dimensional data sources such as Teradata, Hive, Impala, and Oracle
  • Experience developing and deploying applications on Azure
  • Experience working with disparate datasets in multiple formats such as JSON, Avro, text files, Kafka queues, and log data, and with storage such as Blob Storage/ADLS Gen2
  • 2+ years of strong ETL experience with Informatica, Ab Initio, Talend, DataStage, or Syncsort
  • Knowledge of software design and programming principles
  • Experience working in Scrum Agile framework and using DevOps to deploy and manage code.
  • Good communication and teamwork skills
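To give candidates a flavor of the pipeline work described above, here is a minimal, hypothetical sketch of the kind of transformation involved: parsing JSON-lines records and aggregating by a field. It is written in plain Python (no Spark dependency) purely for illustration; the field names "city" and "amount" are invented, and real work at this level would use PySpark against sources like Hive or ADLS Gen2.

```python
import json

# Hypothetical JSON-lines input, one record per line (field names invented).
raw = """
{"city": "San Antonio", "amount": 10}
{"city": "Austin", "amount": 5}
{"city": "San Antonio", "amount": 7}
""".strip().splitlines()

# Parse each record and sum amounts per city.
totals = {}
for line in raw:
    record = json.loads(line)
    totals[record["city"]] = totals.get(record["city"], 0) + record["amount"]

print(totals)  # {'San Antonio': 17, 'Austin': 5}
```

In a PySpark job the same shape of logic would typically be expressed as a `groupBy` plus an aggregation over a DataFrame read from the source system.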

If this role sounds like you, we invite you to apply. Our recruiting team will be in touch shortly with qualified candidates regarding next steps.

Gathi Analytics is an equal opportunity employer and provides opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity, marital status, age, genetic information, disability, veteran status, or any other characteristic protected under applicable federal, state, or local law.

Apply for this job