• Hi!
    I'm Kavil

    Transforming Ideas into Reality with Code

    Actively looking for full-time roles.

    Download CV

  • Innovative Solutions for the Digital World

    Connect on LinkedIn

About Me

Who Am I?

Hi, I'm Kavil. I am currently pursuing a Master of Science in Computer Science at Arizona State University, where I have been honing my skills across many areas of computer science. Before that, I completed my Bachelor of Engineering in Information Technology at LD College of Engineering, Gujarat Technological University.

I'm a self-proclaimed tech geek with an undying love for coding and a slightly unhealthy addiction to coffee. I live in a world where bugs are the bane of my existence, but hey, that's why I keep a trusty fly swatter by my desk. When I'm not battling rogue lines of code, you can find me cracking cheesy programming jokes that only a fellow nerd would appreciate. So buckle up and get ready for a wild ride through my tech-infused sense of humor!

Data Engineering

Machine Learning

Web/Mobile Software

Application Dev/Maintenance

Client-Centric Excellence: Fulfilling Your Project Goals with Precision!

Education

Arizona State University, Tempe, Arizona, USA

Activities and societies:
Software Developers Association (SoDA) Club: Active member at ASU, collaborating on software development projects and organizing tech events.
CodeDevils Club: Contributed to coding challenges and workshops, enhancing coding skills and fostering a vibrant tech community at ASU. Ranked 4th at the ASU Code Challenge Hackathon.

Courses Taken:

  • CSE 575 Statistical Machine Learning
  • CSE 545 Software Security
  • CSE 510 Database Management System Implementation
  • CSE 511 Data Processing at Scale
  • CSE 565 Software Verification/Validation/Test
  • CSE 579 Knowledge Representation and Reasoning
  • CSE 546 Cloud Computing
  • CSE 572 Data Mining
  • CSE 578 Data Visualization
  • CSE 539 Applied Cryptography
Note: My expected graduation date is December 2024, but if a full-time opportunity is available, I can graduate as early as May 2024 by taking extra credits.

LD College of Engineering, Gujarat Technological University, Ahmedabad, India

Activities and societies:
As part of the ROBOCON Club, I coordinated multiple workshops and seminars on robotics during the 2017-2018 academic year. I won the "Design Problem" contest at the 2019 National ISTE Convention and also volunteered in its fundraising efforts. In addition, serving as a Training and Placement Officer at my college and as Class Representative strengthened my leadership skills and my understanding of team dynamics.

Courses Taken:

  • Algorithms and Data Structures
  • Big Data Analytics
  • Data Mining & Business Intelligence
  • Mobile Computing and Wireless Communication
  • Artificial Intelligence
  • Database Management Systems
  • Software Engineering
  • Web Technology
  • Computer Architecture and Organization
  • Advanced Java Programming
  • Python Programming
  • Information and Network Security
  • Engineering Economics

My Skills

I work with a wide range of programming languages, including Java, Python, JavaScript, C, C++, HTML, SASS, PHP, and more. I am proficient in frameworks and tools such as Spring Boot, Django, Angular, React, and Bootstrap, and my expertise extends to RESTful APIs, Hadoop, Selenium, Kafka, TensorFlow, Pandas, and SQL/NoSQL databases. I am well-versed in DevOps practices, using tools like Git, Jenkins, Docker, and Kubernetes. Additionally, I have experience with agile methodologies; cloud platforms such as Azure, GCP, and AWS; and unit, manual, and automation testing.

Programming languages

Java
Python
C
C++
TypeScript
PHP
C#
HTML
CSS
SASS
jQuery
MATLAB
SAS
R

Front-end Frameworks

React.js
Angular
Redux
Bootstrap
Material UI

Back-end Frameworks

Spring Boot
Django
Node.js
Flask
WordPress

Data Engineering Tools

Hadoop
Spark
Kafka
Power BI
Tableau
Weka

Databases

MySQL
Oracle
MongoDB
SQL Server
PostgreSQL

Machine Learning Libraries

NumPy
Pandas
scikit-learn
TensorFlow
Keras

DevOps Technologies

Git
Docker
Kubernetes
Jenkins
Ansible
PowerShell
Bash/Shell Scripting

Cloud Technology

AWS
Google Cloud
Microsoft Azure

Agile and Other

Systems Development Life Cycle (SDLC)
Scrum
Jira
Kanban
Unit/Manual/Automation Testing
Code Review
Code Documentation
JUnit & Selenium
GraphQL
RESTful APIs & Postman
OAuth 2.0

Work Experience

Cloud Platform Engineer Intern (Choice Hotels International, US) June 2024 - August 2024

  • Migrated 3 high-traffic applications from Oracle to AWS Aurora PostgreSQL and moved SSO from LDAP to Okta (OIDC).
  • Assisted in developing and maintaining 5+ CI/CD pipelines and automation workflows using AWS Lambda and Harness.
  • Monitored & improved cloud security compliance, performance, and IAM, achieving a 17% efficiency boost.
  • Implemented IaC using Terraform and CloudFormation, improving deployment consistency and reducing deployment time by 20%.

Custom Software Development Analyst (Accenture; Client: Elevance Health, US; Base location: India) June 2021 - Dec 2022

  • Migrated applications from a monolithic JSF framework to a containerized microservices architecture using Spring Boot and Angular.
  • Improved operational efficiency by optimizing underperforming workflows, achieving a 30% performance boost.
  • Developed 15+ data extraction batch programs using Java, Python, SQL, and Power BI to support data-driven decision-making.
  • Led data analytics initiatives to identify healthcare trends, enhancing early disease detection and patient outcomes.
  • Integrated healthcare systems with Apache Hadoop, ensuring data compliance, enabling cloud-native solutions, and strengthening software delivery and backup strategies for a major U.S. healthcare client.
  • Followed full Software Development Life Cycle (SDLC) and Agile best practices, reducing software defects by 20%.
My Projects

Personal Projects

  • Objective of the Project / Problem Statement:
    The AWS-Optimized Smart Healthcare Hub aims to revolutionize real-time patient monitoring by leveraging AWS cloud solutions and machine learning. The project addresses the need for predictive health analytics by integrating IoT-enabled patient data ingestion, secure processing, and AI-driven risk prediction to enhance healthcare decision-making and patient safety.
  • Approach, Design, and Implementation of the Project:
    The project utilizes a fully serverless architecture with AWS services to ensure scalability, security, and low-latency processing.
    • Data Ingestion: IoT devices collect patient health data and stream it to AWS Kinesis for real-time processing.
    • Processing & Prediction: AWS Lambda functions clean and process data, feeding it into AWS SageMaker, which uses ML models to predict health risks (a minimal code sketch follows this project).
    • Dashboard & User Experience: A React-based dashboard integrates with Electronic Health Records (EHR) systems, displaying actionable insights for medical professionals.
    • Security & Compliance: Ensures HIPAA compliance with encryption, access controls, and AWS IAM role-based authentication.
  • Results:
    The implementation of the AWS-Optimized Smart Healthcare Hub achieved significant outcomes:
    • Real-time patient monitoring that detects health risks with 89% accuracy.
    • Reduced response time for critical health events by integrating predictive analytics.
    • Enhanced security and compliance with AWS’s built-in encryption and HIPAA regulations.
    • Improved healthcare efficiency with an intuitive, user-friendly dashboard.
  • Key Technologies Used:
    • Cloud Services: AWS Lambda, AWS Kinesis, AWS SageMaker
    • Frontend: React.js
    • Backend & Data Processing: Python (serverless architecture)
    • Security & Compliance: AWS IAM, HIPAA-compliant encryption
  • How Is It Different?
    • Utilizes AWS’s serverless architecture to provide cost-efficient, scalable healthcare solutions.
    • Real-time ML-based risk prediction enables proactive healthcare decision-making.
    • Seamless integration with EHR systems for comprehensive patient data access.
    • Ensures compliance with industry standards, including HIPAA, for secure patient data handling.
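
As referenced above, here is a minimal sketch of the Lambda scoring stage. The endpoint name (health-risk-model), the payload shape, and the alert threshold are illustrative assumptions, not details from the original write-up:

```python
# Minimal sketch: decode Kinesis records in Lambda and score each
# reading against a hypothetical SageMaker endpoint.
import base64
import json

import boto3

sagemaker = boto3.client("sagemaker-runtime")

def handler(event, context):
    """AWS Lambda handler for a Kinesis-triggered scoring step."""
    alerts = []
    for record in event["Records"]:
        # Kinesis delivers payloads base64-encoded.
        reading = json.loads(base64.b64decode(record["kinesis"]["data"]))
        response = sagemaker.invoke_endpoint(
            EndpointName="health-risk-model",   # hypothetical endpoint name
            ContentType="application/json",
            Body=json.dumps(reading["vitals"]),  # assumed payload field
        )
        risk = json.loads(response["Body"].read())
        if risk.get("score", 0.0) > 0.8:         # assumed alert threshold
            alerts.append({"patient_id": reading["patient_id"], **risk})
    return {"alerts": alerts}
```
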
  • Objective of the Project / Problem Statement:
    Artistica is a Django-based social media platform that aims to bring together art enthusiasts and creators by providing a space to share and engage with artwork. The project addresses the need for a creative online community that supports AI-powered content moderation and art recognition to maintain a safe and inspiring environment.
  • Approach, Design, and Implementation of the Project:
    Artistica was designed as a scalable, AI-powered social media platform with the following components:
    • Frontend & User Interface: Built with Bootstrap for a responsive design, ensuring seamless user experience across devices.
    • Backend & Database: Django and MySQL handle user data, posts, and interactions efficiently.
    • AI & Machine Learning: TensorFlow-based deep learning models enable content moderation and intelligent art recognition (see the sketch after this project).
    • Social Media Integration: APIs for Facebook, Twitter, Reddit, and YouTube allow users to share their artwork across platforms.
    • Cloud Deployment: Google Cloud provides scalable hosting and storage solutions for efficient platform performance.
  • Results:
    The implementation of Artistica yielded impressive outcomes:
    • AI-powered content moderation ensures a respectful and engaging environment.
    • Art recognition system achieves a 94% accuracy rate in detecting inappropriate content and predicting artists.
    • Seamless multi-platform content sharing enhances user engagement and visibility.
    • Interactive features such as likes, comments, and artistic transformations create an immersive user experience.
  • Key Technologies Used:
    • Backend: Python, Django
    • Frontend: Bootstrap
    • Database: MySQL
    • AI & ML: TensorFlow (art recognition, content moderation, style transformation)
    • Cloud: Google Cloud (hosting and storage)
    • Social Media APIs: Facebook, Twitter, Reddit, YouTube
  • How Is It Different?
    • AI-powered content moderation ensures a safe and creative space for artists.
    • Advanced art recognition system predicts the artist and style with 94% accuracy.
    • Multi-platform content sharing provides seamless social media engagement.
    • AI-based style transformation offers unique artistic experiences.
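
As a rough illustration of the moderation step above, here is a minimal sketch assuming a hypothetical Keras model file (moderation_model.keras) that outputs P(inappropriate) for a 224x224 RGB image; the file name, input size, and threshold are illustrative, not the project's actual values:

```python
# Minimal sketch of the upload-time moderation check.
import numpy as np
import tensorflow as tf

# Hypothetical saved model; the real project trains its own.
model = tf.keras.models.load_model("moderation_model.keras")

def is_appropriate(image_path: str, threshold: float = 0.5) -> bool:
    """Return True if the uploaded artwork passes moderation."""
    img = tf.keras.utils.load_img(image_path, target_size=(224, 224))
    batch = np.expand_dims(tf.keras.utils.img_to_array(img) / 255.0, axis=0)
    p_inappropriate = float(model.predict(batch, verbose=0)[0][0])
    return p_inappropriate < threshold
```
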
  • Objective of the Project / Problem Statement:
    In today’s fast-paced financial markets, investors struggle to make informed investment decisions due to the sheer volume of real-time data and market fluctuations. Traditional portfolio management tools offer static analysis, but they lack dynamic adaptability and real-time insights. This project aims to build an AI-powered investment platform that provides personalized portfolio optimization and real-time market trend predictions.
  • Approach, Design, and Implementation of the Project:
    The project utilizes a scalable and real-time architecture to ensure efficient investment decision-making.
    • Data Ingestion: Uses Kafka for real-time stock data streaming and processing (a sketch of the consumer pattern follows this project).
    • AI & Machine Learning: TensorFlow-based models for trend prediction, risk assessment, and automated investment recommendations.
    • Backend & API: Java Spring Boot microservices provide portfolio management, authentication, and AI-driven insights.
    • Frontend Dashboard: Angular-based UI with WebSocket integration for real-time alerts and stock tracking.
    • Security & Deployment: AWS-based deployment with OAuth2 authentication, role-based access control, and multi-factor authentication.
  • Results:
    The implementation of the AI-powered Smart Investment Portfolio Manager has achieved the following outcomes:
    • Real-time stock tracking and market insights, improving investment decision-making.
    • AI-driven personalized investment recommendations based on user risk profiles.
    • Scalable cloud-native deployment, ensuring high availability and low latency.
    • Enhanced security measures with robust authentication and encrypted transactions.
  • Key Technologies Used:
    • Backend: Java, Spring Boot, REST APIs
    • Frontend: Angular, WebSockets
    • AI & Machine Learning: TensorFlow for risk assessment, trend prediction, and recommendations
    • Event Streaming: Apache Kafka for real-time data processing
    • Security: OAuth2, JWT, Role-based Access Control
    • Cloud Deployment: AWS (Lambda, S3, DynamoDB/RDS, Kubernetes for scalability)
  • How Is It Different?
    • Real-time AI-driven decision-making for adaptive investment strategies.
    • Personalized portfolio optimization tailored to user risk preferences.
    • Event-driven architecture with Kafka for instantaneous stock tracking.
    • Scalable cloud-native deployment ensuring uninterrupted access.
    • Advanced security with encrypted transactions and regulatory compliance.
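
The production services are Java/Spring Boot; as a language-neutral illustration of the event-driven pattern above, here is a minimal Python sketch using kafka-python. The topic name, broker address, message fields, and alert rule are all assumptions:

```python
# Minimal sketch of consuming real-time stock ticks from Kafka.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "stock-ticks",                          # hypothetical topic name
    bootstrap_servers="localhost:9092",     # illustrative broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for tick in consumer:
    msg = tick.value
    # Placeholder for the TensorFlow trend model: flag large moves.
    if abs(msg["change_pct"]) > 5.0:        # assumed alert rule
        print(f"ALERT {msg['symbol']}: {msg['change_pct']:+.1f}%")
```
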
  • Objective of the Project / Problem Statement:
    The Multilingual E-Commerce Analytics Dashboard provides businesses with real-time insights into sales and customer behavior through an intuitive and customizable dashboard. By integrating predictive analytics and scalable data processing, the project addresses the need for fast, accurate, and actionable business intelligence in the e-commerce sector.
  • Approach, Design, and Implementation of the Project:
    The project leverages a modern, scalable architecture to deliver real-time analytics with seamless integration across multiple e-commerce platforms.
    • Real-time Data Processing: Apache Spark processes incoming sales and inventory data instantly, ensuring up-to-the-second insights (sketched in code after this project).
    • Advanced Search Capabilities: Elasticsearch enables fast and precise data retrieval, enhancing decision-making processes.
    • Frontend & User Interface: A React-based dashboard ensures an engaging, responsive, and intuitive user experience across devices.
    • State Management: Redux is used for efficient application state management, ensuring smooth interactions and updates.
    • Predictive Analytics: Machine learning models forecast sales trends, identify inventory bottlenecks, and optimize stock management.
    • Security & Authorization: Secure authentication and role-based access controls ensure that sensitive business data remains protected.
    • Integration with E-Commerce Platforms: Supports seamless data ingestion from Shopify, WooCommerce, Magento, and other major platforms.
  • Results:
    The implementation of the Multilingual E-Commerce Analytics Dashboard delivers:
    • Real-time tracking of sales, customer behavior, and inventory updates.
    • Improved business decision-making with predictive analytics and comprehensive reporting.
    • Enhanced search capabilities, allowing businesses to retrieve insights instantly.
    • Secure and customizable dashboards, enabling users to tailor their analytics experience.
  • Key Technologies Used:
    • Backend & Data Processing: Java, Apache Spark, HBase
    • Frontend: React.js, Redux
    • Search & Query: Elasticsearch
    • Security: User authentication and authorization
    • Data Integration: APIs for Shopify, WooCommerce, Magento
  • How Is It Different?
    • Offers multilingual support, making it accessible for global e-commerce businesses.
    • Utilizes Apache Spark for real-time data processing, ensuring immediate insights.
    • Advanced Elasticsearch-powered search for precise data retrieval.
    • Fully customizable dashboards tailored to business needs, with drag-and-drop functionality.
    • Predictive analytics for sales forecasting and inventory optimization, reducing revenue loss.
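
To make the streaming layer concrete, here is a minimal PySpark Structured Streaming sketch. The Kafka topic name, broker address, and event schema are illustrative assumptions; the real pipeline's sources and rollups may differ:

```python
# Minimal sketch: per-minute revenue rollup over a stream of orders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("ecommerce-analytics").getOrCreate()

schema = StructType([
    StructField("sku", StringType()),       # assumed event fields
    StructField("amount", DoubleType()),
    StructField("ts", TimestampType()),
])

orders = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "orders")          # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("o"))
    .select("o.*")
)

# The kind of rollup the dashboard charts: revenue per SKU per minute.
revenue = orders.groupBy(F.window("ts", "1 minute"), "sku").agg(
    F.sum("amount").alias("revenue")
)

query = revenue.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```
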
  • Objective of the Project / Problem Statement:
    The goal of this project is to analyze and compare state-of-the-art network representation learning methods for link prediction. The study focuses on evaluating their effectiveness in predicting missing or future links in networks using datasets such as CORA and SNAP-Facebook. This research aims to enhance the understanding of how different network embedding techniques perform in real-world applications.
  • Approach, Design, and Implementation of the Project:
    The project follows a structured pipeline to systematically evaluate various network representation learning methods.
    • Data Preprocessing: The CORA and SNAP-Facebook datasets are cleaned and formatted for network analysis.
    • Model Selection: Various network embedding methods such as Node2Vec, DeepWalk, and Graph Neural Networks (GNNs) are implemented using Python (see the pipeline sketch after this project).
    • Link Prediction: Models generate feature representations of nodes, which are then used to predict missing or future connections.
    • Performance Evaluation: Metrics such as precision, recall, F1-score, and ROC-AUC are used to assess the accuracy and effectiveness of each method.
  • Results:
    The comparative analysis of different network representation learning techniques provided the following insights:
    • High accuracy in link prediction was achieved using Graph Neural Networks, outperforming traditional approaches.
    • Node2Vec and DeepWalk demonstrated strong performance on sparse networks but struggled with highly dynamic graphs.
    • ROC-AUC scores indicated that model selection depends on the dataset structure and sparsity.
    • Findings contribute to better selection of machine learning models for real-world network applications.
  • Key Technologies Used:
    • Programming & Libraries: Python, Scikit-Learn, TensorFlow, NumPy, Pandas, Matplotlib
    • Network Representation Learning: Node2Vec, DeepWalk, Graph Neural Networks (GNNs)
    • Performance Evaluation: Precision, Recall, F1-score, ROC-AUC
  • How Is It Different?
    • Comprehensive comparison of multiple network embedding techniques in a single study.
    • Evaluation performed on both citation (CORA) and social network (SNAP-Facebook) datasets.
    • Uses multiple performance metrics for a holistic assessment of model effectiveness.
    • Provides insights into dataset-specific advantages of different link prediction methods.
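
A compressed sketch of the evaluation pipeline, using the community node2vec package and a small random stand-in graph (the study used CORA and SNAP-Facebook, whose loading is elided here). The Hadamard edge encoding and naive negative sampling are common choices, not necessarily the study's exact setup:

```python
# Minimal sketch: Node2Vec embeddings + logistic regression for links.
import networkx as nx
import numpy as np
from node2vec import Node2Vec
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

graph = nx.fast_gnp_random_graph(200, 0.05, seed=42)  # stand-in graph

model = Node2Vec(graph, dimensions=64, walk_length=20, num_walks=50,
                 workers=2, quiet=True).fit(window=10, min_count=1)

def edge_feature(u, v):
    """Hadamard product of node embeddings, a common edge encoding."""
    return model.wv[str(u)] * model.wv[str(v)]

pos = list(graph.edges())
neg = list(nx.non_edges(graph))[: len(pos)]   # naive negative sampling
X = np.array([edge_feature(u, v) for u, v in pos + neg])
y = np.array([1] * len(pos) + [0] * len(neg))

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("train ROC-AUC:", roc_auc_score(y, clf.predict_proba(X)[:, 1]))
```
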
  • Objective of the Project / Problem Statement:
    The Integrated Disaster Response and Relief System (IDRRS) is designed to optimize disaster management efforts by leveraging technology and data-driven strategies. The project aims to enhance emergency response, streamline relief operations, and improve coordination between agencies in crisis situations.
  • Approach, Design, and Implementation of the Project:
    The system is structured with several core modules to ensure effective disaster response and relief efforts:
    • Volunteer Management: Allows for the efficient deployment of rescue teams and resources while tracking their movements using GPS technology.
    • Logistic Management: Ensures optimized supply chain governance for the timely delivery of essential supplies to affected areas.
    • Mass Communication Aggregator: Uses AI-driven supervised learning algorithms to filter and verify data from social media, identifying genuine victims and transmitting accurate information to authorities. Additionally, it generates real-time heatmaps for affected regions to improve resource allocation (a minimal classifier sketch follows this project).
    • Resource Optimization: Leverages real-time data and predictive analytics to distribute resources efficiently, ensuring aid reaches the areas with the highest need.
    • Disaster Prediction: Utilizes historical data and weather forecasts to provide early warnings, helping authorities take preemptive measures.
  • Results:
    The implementation of IDRRS led to the following key outcomes:
    • Improved response efficiency through real-time volunteer tracking and coordination.
    • Optimized logistics and resource allocation, reducing delays in disaster relief efforts.
    • Enhanced accuracy in disaster impact assessment using AI-driven social media analysis.
    • Early warning alerts enabled authorities to take proactive measures in disaster-prone areas.
  • Key Technologies Used:
    • AI & Machine Learning: Supervised learning algorithms for social media data analysis.
    • Cloud Computing: Scalable backend infrastructure for data processing and analytics.
    • GPS & Geospatial Analysis: Real-time tracking of volunteers and relief teams.
    • Big Data Analytics: Predictive analytics for disaster forecasting and resource optimization.
  • How Is It Different?
    • AI-driven social media analysis ensures accurate information filtering and victim identification.
    • Real-time volunteer tracking improves disaster response efficiency.
    • Predictive analytics optimize resource distribution and supply chain management.
    • Integrated disaster prediction allows authorities to act before a disaster strikes.
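
As a toy illustration of the supervised social-media filter described above, here is a minimal scikit-learn sketch; the example messages, labels, and model choice are assumptions, and the real aggregator would train on a much larger annotated corpus:

```python
# Minimal sketch: classify messages as genuine help requests vs. noise.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labels: 1 = genuine request for help, 0 = noise/rumor.
texts = [
    "Family of four trapped on roof near river bridge, need boat",
    "Prayers for everyone affected by the flood",
    "Water entering ground floor, elderly patient needs evacuation",
    "Can't believe the weather this week!",
]
labels = [1, 0, 1, 0]

filter_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                             LogisticRegression())
filter_model.fit(texts, labels)

print(filter_model.predict(["Two people stranded, send rescue team"]))
```
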
  • Objective of the Project / Problem Statement:
    The Intelligent Epidemic Surveillance project aimed to provide analytics on potential epidemic hotspots across India. By integrating clinical data from government and selected non-government clinics, the system enabled early detection and response to emerging health threats, ensuring proactive epidemic management.
  • Approach, Design, and Implementation of the Project:
    The system was designed as a scalable data-driven surveillance platform with the following components:
    • Data Integration: Collected patient symptoms, doctor diagnoses, and demographic data from clinics.
    • Geographic Analysis: Mapped epidemiological trends with real-time geographic data to detect high-risk areas (a minimal hotspot sketch follows this project).
    • Historical Data Modeling: Used past epidemic records to model potential outbreak patterns.
    • Unique Patient Identification: Leveraged Aadhaar card details to track individual patient records securely.
  • Results:
    The implementation of the Intelligent Epidemic Surveillance system led to:
    • Improved early detection of epidemic outbreaks with predictive analytics.
    • Enhanced public health response by providing real-time hotspot identification.
    • Accurate tracking of patient histories while maintaining data security and privacy.
    • Optimized resource allocation for healthcare authorities to prevent outbreaks.
  • Key Technologies Used:
    • Data Analytics: Python, Pandas, NumPy for epidemiological analysis
    • Database: MySQL for storing clinical and demographic data
    • Geospatial Mapping: GIS and Google Maps API for epidemic hotspot visualization
    • Security: Aadhaar-based authentication ensuring data privacy
  • How Is It Different?
    • Uses real-time clinical data to predict epidemic trends rather than relying solely on historical statistics.
    • Integrates geospatial mapping for hotspot detection, providing actionable insights for healthcare authorities.
    • Utilizes Aadhaar card-based identification to maintain patient uniqueness and prevent duplicate records.
    • Facilitates proactive epidemic prevention by enabling early intervention measures.
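
A minimal sketch of hotspot flagging over clinic reports, assuming a pandas DataFrame with illustrative column names and an arbitrary threshold; the deployed system's schema and rules were more involved:

```python
# Minimal sketch: flag districts whose weekly case counts spike.
import pandas as pd

reports = pd.DataFrame({
    "district": ["Ahmedabad", "Ahmedabad", "Surat", "Rajkot"],
    "diagnosis": ["dengue", "dengue", "dengue", "malaria"],
    "date": pd.to_datetime(["2020-07-01", "2020-07-02",
                            "2020-07-01", "2020-07-03"]),
})

# 7-day case counts per district and diagnosis.
weekly = (reports.groupby(["district", "diagnosis",
                           pd.Grouper(key="date", freq="7D")])
          .size().rename("cases").reset_index())

THRESHOLD = 2  # assumed outbreak threshold for the toy data
hotspots = weekly[weekly["cases"] >= THRESHOLD]
print(hotspots)
```
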

1) Objective of the Project / Problem Statement

The real estate market lacks an efficient and automated method for accurately predicting property values. Traditional valuation techniques rely on manual appraisals, which are time-consuming and subject to human error. This project aims to develop an Automated Real Estate Valuation System that leverages machine learning to provide accurate and instant property value predictions. By integrating cloud computing and a user-friendly interface, the system enhances accessibility and efficiency for buyers, sellers, and real estate professionals.

2) Approach, Design, and Implementation

  • Frontend (Angular): Developed a responsive UI that allows users to input property details and instantly receive an estimated valuation.
  • Backend (C# .NET, Azure Functions): Implemented serverless computing for scalable and high-performance data processing.
  • Machine Learning Integration: Trained models using historical property sales data, location trends, and market analysis to provide accurate valuations (a model-training sketch follows this project).
  • Data Storage: Utilized Azure SQL Database for storing property records and valuation results.
  • API & Cloud Computing: Developed REST APIs for seamless data communication and deployed the solution on Azure for high availability and reliability.

3) Results

After implementation, the Automated Real Estate Valuation System successfully delivers real-time property value predictions with high accuracy. It eliminates the need for manual appraisals and accelerates decision-making in real estate transactions. Users benefit from an easy-to-use interface, while real estate professionals gain access to data-driven insights that improve market analysis and investment decisions.

The system has proven to be scalable and efficient, handling large datasets and serving multiple users simultaneously without performance degradation.

4) Key Technologies Used

  • Frontend: Angular (Responsive UI, Dynamic Data Binding)
  • Backend: C# .NET (REST API, Business Logic)
  • Cloud Computing: Microsoft Azure (Azure Functions, SQL Database)
  • Machine Learning: ML Models for property valuation predictions

5) How Is It Different?

  • AI-Powered Accuracy: Uses machine learning for precise property valuation, reducing human error.
  • Cloud-Optimized Performance: Serverless architecture ensures scalability and cost efficiency.
  • Instant Valuation: Provides real-time property value predictions, unlike traditional manual assessments.
  • User-Friendly Interface: Designed for real estate professionals and homeowners to access valuations effortlessly.
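
The deployed backend is C#/.NET on Azure; as an illustration of the valuation model itself, here is a minimal Python sketch trained on synthetic features (square footage, bedrooms, location score). All data and feature choices here are illustrative:

```python
# Minimal sketch: gradient-boosted regression for property valuation.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Toy features: square footage, bedrooms, location score.
X = np.column_stack([rng.uniform(500, 4000, n),
                     rng.integers(1, 6, n),
                     rng.uniform(0, 10, n)])
# Synthetic prices with noise, purely for demonstration.
y = (50_000 + 120 * X[:, 0] + 15_000 * X[:, 1] + 8_000 * X[:, 2]
     + rng.normal(0, 20_000, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out homes: {model.score(X_test, y_test):.2f}")
```
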

1) Objective of the Project / Problem Statement

The hospitality industry faces challenges in managing real-time room availability, dynamic pricing, and personalized guest experiences. Traditional booking systems lack adaptability and struggle to optimize revenue based on demand fluctuations. This project aims to develop a Smart Hotel Booking & Management System that enhances reservation efficiency, automates pricing strategies, and personalizes guest experiences using AI-driven insights.

2) Approach, Design, and Implementation

  • Frontend (Angular): Developed a user-friendly booking portal with real-time room availability and intuitive navigation.
  • Backend (C# .NET, Azure Functions): Implemented serverless computing to handle high volumes of booking requests efficiently.
  • AI-Powered Personalization: Integrated machine learning models to analyze guest preferences and provide tailored booking recommendations.
  • Dynamic Pricing Engine: Implemented AI-driven algorithms to adjust room rates based on demand, seasonality, and occupancy trends (a minimal pricing sketch follows this project).
  • Admin Panel: Built a responsive Angular-based interface for hotel staff to manage reservations, billing, and guest profiles seamlessly.

3) Results

After implementation, the Smart Hotel Booking & Management System significantly improves booking efficiency and enhances the guest experience. Hotels benefit from an optimized revenue model due to dynamic pricing, while guests enjoy personalized recommendations that suit their preferences.

The system ensures seamless hotel operations, from check-in to billing, reducing administrative overhead and enhancing overall guest satisfaction.

4) Key Technologies Used

  • Frontend: Angular (Interactive UI, Booking Interface)
  • Backend: C# .NET (REST API, Business Logic)
  • Cloud Computing: Microsoft Azure (Azure Functions, SQL Database)
  • Machine Learning: AI models for customer preference analysis and dynamic pricing

5) How Is It Different?

  • Real-Time Room Availability: Ensures accurate room inventory updates for seamless booking experiences.
  • AI-Driven Personalization: Provides tailored booking suggestions based on guest preferences and behavior.
  • Dynamic Pricing Model: Automatically adjusts rates to maximize revenue and optimize occupancy levels.
  • Efficient Hotel Management: Integrated admin panel simplifies reservations, billing, and guest interactions.
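
A minimal sketch of a demand-based pricing rule of the kind described above; the multipliers and thresholds are invented for illustration, whereas the real engine derives its adjustments from learned demand models:

```python
# Minimal sketch: adjust a nightly rate from occupancy, lead time,
# and season. All coefficients are illustrative.
def dynamic_rate(base_rate: float, occupancy: float,
                 days_until_checkin: int, peak_season: bool) -> float:
    """Return an adjusted nightly rate."""
    rate = base_rate
    rate *= 1.0 + 0.5 * max(occupancy - 0.6, 0.0)  # surge above 60% full
    if days_until_checkin <= 3:
        rate *= 1.10                                # last-minute premium
    if peak_season:
        rate *= 1.25
    return round(rate, 2)

print(dynamic_rate(120.0, occupancy=0.85, days_until_checkin=2,
                   peak_season=True))
```
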

1) Objective of the Project / Problem Statement

Traditional relational database systems like Minibase store data in tuples, making them inefficient for handling large-scale distributed applications that require high availability and fault tolerance. This project aims to modify Minibase to support a wide-column store similar to BigTable/HBase, improving data access patterns for large-scale applications. By introducing a Map-based data model and versioning, we enhance Minibase to handle time-series and structured data more efficiently.

2) Approach, Design, and Implementation

  • Data Model: We extend Minibase with a new Map construct containing four fields (a Python sketch of these semantics follows this project):
    (row: string, column: string, time: int) → (value: string)
  • Versioning Support: The system maintains the three most recent versions of each Map entry to enable historical data retrieval and consistency.
  • Indexing & Query Optimization: Implemented efficient indexing to allow fast retrieval of wide-columnar data, optimizing queries for performance.
  • Scalability: Enhanced Minibase to support efficient read and write operations for wide-column storage, making it adaptable for large-scale data applications.

3) Results

The modified Minibase implementation successfully provides a scalable and structured wide-column store, offering improved efficiency over traditional tuple-based storage. By supporting versioning, the system allows users to access historical data while maintaining optimal query performance.

The enhancements make Minibase more flexible for applications requiring structured data storage, such as time-series databases and real-time analytics.

4) Key Technologies Used

  • Database System: Minibase (modified for wide-column storage)
  • Data Storage: Wide-column store architecture
  • Indexing: Optimized indexing for fast lookups
  • Versioning: Support for the last three versions of each data entry

5) How Is It Different?

  • Column-Oriented Storage: Unlike traditional Minibase, this implementation supports a wide-column store similar to BigTable.
  • Versioning Support: Maintains multiple versions of data, allowing historical queries.
  • Optimized for Large Datasets: Designed to handle high-scale applications with efficient data retrieval.
  • Flexible Query Model: Enables structured access to time-series and columnar data for analytical workloads.
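
To make the Map semantics concrete, here is a small Python sketch of the versioned (row, column, time) → value store. The actual project modifies Minibase internals rather than using an in-memory dictionary, so this only models the behavior:

```python
# Minimal sketch: each (row, column) keeps only the three most
# recent timestamped values, mirroring the Map construct above.
from collections import defaultdict

class WideColumnMap:
    MAX_VERSIONS = 3

    def __init__(self):
        # (row, column) -> list of (time, value), newest first
        self._store = defaultdict(list)

    def put(self, row: str, column: str, time: int, value: str) -> None:
        versions = self._store[(row, column)]
        versions.append((time, value))
        versions.sort(key=lambda tv: tv[0], reverse=True)
        del versions[self.MAX_VERSIONS:]   # evict older versions

    def get(self, row: str, column: str, time=None):
        """Latest value, or the newest value at or before `time`."""
        for t, v in self._store.get((row, column), []):
            if time is None or t <= time:
                return v
        return None

m = WideColumnMap()
for t in range(1, 5):
    m.put("patient:42", "bp", t, f"reading-{t}")
print(m.get("patient:42", "bp"))        # reading-4 (only 3 newest kept)
print(m.get("patient:42", "bp", 2))     # reading-2
```
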

1) Objective of the Project / Problem Statement

Many plant owners struggle to maintain the right environment for their indoor plants due to a lack of knowledge about water, light, humidity, and soil conditions. SmartPlant aims to provide an intelligent, AI-driven monitoring system that helps users optimize plant care by analyzing real-time sensor data and providing actionable insights.

2) Approach, Design, and Implementation

  • Frontend (Angular): A dynamic web interface displaying real-time plant health data and care recommendations.
  • Backend (Spring Boot, PostgreSQL): A microservices-based architecture storing and processing sensor data.
  • Machine Learning (TensorFlow): Trained models analyze environmental data and suggest watering schedules, light exposure adjustments, and nutrient levels.
  • IoT Integration: Uses MQTT to connect with sensors for real-time monitoring of soil moisture, temperature, and humidity (a subscriber sketch follows this project).
  • Cloud Deployment: Deployed on AWS with auto-scaling capabilities to handle multiple users seamlessly.

3) Results

After implementation, SmartPlant successfully provides real-time insights into plant health, ensuring optimal growth conditions. Users experience a significant reduction in plant care mistakes and an increase in plant longevity.

The system has also proven to be scalable, supporting multiple plant types with customized recommendations.

4) Key Technologies Used

  • Frontend: Angular (Real-time Data Visualization, User Interface)
  • Backend: Spring Boot, PostgreSQL (Microservices, Database Management)
  • Machine Learning: TensorFlow (AI-driven plant care recommendations)
  • IoT Integration: MQTT (Sensor Communication and Real-time Updates)
  • Cloud Deployment: AWS (Scalability and High Availability)

5) How Is It Different?

  • AI-Powered Insights: Uses machine learning to provide personalized plant care suggestions.
  • IoT-Enabled Monitoring: Real-time sensor data analysis ensures timely alerts and actions.
  • Cloud-Based Solution: Enables seamless access and scalability for multiple users.
  • Custom Recommendations: Adapts to different plant species based on environmental conditions.
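
A minimal sketch of the sensor-ingestion side using paho-mqtt; the broker address, topic layout, payload fields, and moisture threshold are illustrative assumptions:

```python
# Minimal sketch: subscribe to plant telemetry and flag dry soil.
import json

import paho.mqtt.client as mqtt

DRY_SOIL = 30.0  # assumed moisture threshold (percent)

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    if reading.get("soil_moisture", 100.0) < DRY_SOIL:
        plant_id = msg.topic.split("/")[1]
        print(f"Plant {plant_id}: soil at {reading['soil_moisture']}%, "
              "water soon")

client = mqtt.Client()  # paho-mqtt 1.x-style constructor
client.on_message = on_message
client.connect("broker.example.com", 1883)   # illustrative broker
client.subscribe("smartplant/+/telemetry")   # hypothetical topic layout
client.loop_forever()
```
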

Description: My graphic design project showcases a range of creative and visually stunning designs crafted to meet diverse client needs.

A Sample Flyer for a DJ Party

A Sample Flyer for a Diwali Party

A Sample Promotional Newsletter

My Live Projects from Industry

Ajeenkya DY Patil University

Ajeenkya DY Patil University is a vibrant educational institution committed to nurturing innovation and creativity among its students. With state-of-the-art facilities and a diverse range of programs, it provides a dynamic learning environment that empowers future leaders. ADYPU's strong emphasis on industry-relevant skills and holistic development ensures graduates are well-prepared to excel in their chosen fields.

9th House

9th House is an experience and hospitality design studio offering affordable and accessible design services to the hospitality industry. They aim to democratize hospitality design by innovating the design process and delivering faster, better, and more affordable solutions. Their team takes a conceptual and holistic approach, translating brands and ideas into unique places and experiences, constantly pursuing innovation and excellence.

Shah Bhogilal Jethalal & Bros.

Shah Bhogilal Jethalal & Bros. is a fire protection equipment manufacturing company established in 1933 by Shri Bhogilal Jethalal Shah. Originally focused on producing brass valves for the textile industry, the company diversified into fire-fighting equipment as the textile industry declined. Today, the company, now known as AAAG, is run by the third and fourth generations of the family and continues to uphold quality and reliability across its wide range of water- and foam-based fire-fighting equipment, manufactured in a modern 50,000 sq. ft. facility.

Kewlani Agro

Kewlani Agro, established in the 1980s as Anil Industries Pvt. Ltd., has been a pioneer in providing nutrition through top-quality grains for over four decades. With three generations of expertise and modern facilities, they've expanded into producing premium wheat flour, sooji, maida, and chokar. From their humble beginnings, they've grown to serve markets across India, making their Pukhraj and Charminar pulses a household name in every Indian kitchen.

Hectare

Hectare is a dynamic and forward-thinking agricultural technology company dedicated to revolutionizing the farming landscape. With a commitment to sustainability, innovation, and efficiency, Hectare offers cutting-edge solutions and services designed to empower farmers and enhance agricultural productivity. By harnessing the latest advancements in technology and data-driven insights, Hectare is poised to drive positive change in the agriculture sector, ensuring a brighter future for farmers and the global food supply chain.

Axis India

Axis India is a dynamic and forward-thinking company dedicated to providing cutting-edge solutions in electrical and automation engineering. With a rich history spanning several decades, Axis India has firmly established itself as an industry leader. Its commitment to innovation, quality, and customer satisfaction has led it to deliver world-class products and services to a diverse range of clients across the globe.

Certifications

Get in Touch