Tag: technology

  • ClimaGrid, Our 2025 Code4Hope Project

    Last week, my team and I had the distinct pleasure of visiting the Microsoft building in New York as part of the finals for the Code4Hope hackathon!

    I wanted to share what we accomplished through the event here (we got 2nd place!!), and I will link our presentation and GitHub repository at the bottom.

    First and foremost, the Code4Hope 2025 hackathon was a high school hackathon held over two rounds. For the first round, we had three days to develop our product; if we were selected as finalists, we would advance to the final round in New York. As for prompts, we were all given a list of company briefs. Each brief detailed a fictional company facing a fictional yet plausible major problem, and we were tasked with using programming to develop a creative and effective solution.

    Here is the brief that we chose to base our product on: “GreenSpan builds futuristic cities designed to be net-zero and in harmony with nature. They integrate smart systems for water, energy, and waste, but scaling those systems while preserving livability is proving difficult. Their cities must adapt to both dense populations and changing climate patterns”.

    Our solution to the proposed problem, helping cities adapt their sustainable systems to growing populations and changing climate patterns, is ClimaGrid, a grid-based city builder meant to help urban planners design more sustainable cities.

    In the ClimaGrid interface, users design a city using various custom metrics. They pick a grid size and then fill each tile with one of four colors: blue represents water, green represents green space, light gray represents low-population housing (such as suburbs), and dark gray represents denser, more urban areas. The tool also takes in the coordinates (latitude and longitude) of the proposed city as well as a target year for projections.
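    The inputs described above could be captured in a small data structure like the following sketch. The tile codes and field names here are my illustrative assumptions, not the project's actual schema:

```python
from dataclasses import dataclass

# Hypothetical tile codes for the four colors described above;
# the real project may encode them differently.
WATER, GREEN, SUBURB, URBAN = "W", "G", "L", "U"

@dataclass
class CityDesign:
    grid: list        # 2D list of tile codes, one entry per grid square
    latitude: float   # proposed city location
    longitude: float
    year: int         # target year for climate projections

design = CityDesign(
    grid=[
        [WATER, GREEN, URBAN],
        [GREEN, URBAN, URBAN],
        [SUBURB, SUBURB, GREEN],
    ],
    latitude=40.71,
    longitude=-74.01,
    year=2050,
)
```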

    Above is a sample output of our code. It returns heatmaps showing heat and waste distribution, as well as an energy usage map that highlights the areas where the most energy is demanded. There is also a projected aerial view generated by an image generation model, although this was not always perfectly effective.

    The heatmaps first take in a 2D array in which each tile type is encoded as a different value. The latitude, longitude, and year are sent to an AWS API, which uses future climate models to estimate the temperature in the given year at peak stress (the hottest time of the year). Once this temperature is fetched, it is scaled across the heatmap depending on the type of surface in each grid square. Cellular automata then disperse the heat to model real-life heat-island effects: temperature is trapped more easily in urban areas that lack nearby water or green space. The pollution and energy heatmaps operate in much the same way. The model behind the projected aerial view was trained on thousands of overhead satellite images; it takes the grid as input and generates an image from noise. Given more time, this aerial view model could have been refined further.
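    The seed-and-disperse step above can be sketched as a simple cellular automaton. The absorption factors, retention rate, and neighbor-averaging rule below are toy assumptions for illustration, not our exact parameters:

```python
import copy

# Hypothetical per-surface scaling: water and green space absorb heat,
# urban tiles trap it (the heat-island effect described above).
ABSORPTION = {"W": 0.6, "G": 0.8, "L": 1.0, "U": 1.2}

def seed_heat(grid, peak_temp):
    """Scale the fetched peak temperature by each tile's surface type."""
    return [[peak_temp * ABSORPTION[tile] for tile in row] for row in grid]

def disperse(heat, steps=10, retain=0.7):
    """One cellular-automaton rule applied repeatedly: each cell keeps
    `retain` of its heat and averages the rest with its 4-connected
    neighbors, so heat pools in dense urban blocks and drains toward
    water and green space."""
    rows, cols = len(heat), len(heat[0])
    for _ in range(steps):
        nxt = copy.deepcopy(heat)
        for r in range(rows):
            for c in range(cols):
                nbrs = [heat[r2][c2]
                        for r2, c2 in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                        if 0 <= r2 < rows and 0 <= c2 < cols]
                nxt[r][c] = retain * heat[r][c] + (1 - retain) * sum(nbrs) / len(nbrs)
        heat = nxt
    return heat

grid = [["W", "G", "U"],
        ["G", "U", "U"],
        ["L", "L", "G"]]
heat = disperse(seed_heat(grid, peak_temp=35.0))
```

    Even in this toy version, the dense urban center stays hotter than the water corner after dispersion, which is the qualitative behavior the real heatmaps show.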

    This is a very brief overview of our project; you are more than welcome to check out the additional resources below, which include our demo video, slideshow presentation, demo website, and complete GitHub repo.

    Link to GitHub: https://github.com/JhonJhonDev/ClimaGrid

    Link to Demo Website (only the frontend works, as we had to host the backend locally): ClimaGrid

  • AI in Data Science

    A lot of people today tend to confuse the concepts of “Artificial Intelligence” and “Data Science”, treating the former as a buzzword for the latter. However, while they are related, they serve different purposes. Data science is fundamentally the use of statistical tools and analysis to give meaning to a large set of data. AI may then use the patterns found in that data to create machines capable of performing tasks that would otherwise require human cognition.

    While the concept of data science has existed for a long time, it is the use of AI to enhance and act on what is derived from data that has improved immensely over the past few years. Nearly every neural network and machine learning algorithm requires a training dataset. The model breaks this dataset down until it finds patterns that relate certain traits of the data to an expected output. For example, in a regression model for housing prices as a function of various factors (location, square footage, amenities, view, surrounding area, etc.), statistical analysis can estimate how heavily each factor should be weighted, but the idea that these learned patterns can then be fed into a program to automate a task such as predicting housing prices is very much artificial intelligence.
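    The housing-price idea can be made concrete with a single factor and hand-rolled least squares; real models use many factors and a proper ML library, and all numbers below are made up for illustration:

```python
def fit_line(xs, ys):
    """Statistical step: estimate slope and intercept by least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Illustrative training data: square footage -> sale price ($1000s)
sqft   = [1200, 2000, 1500, 2500, 1000]
prices = [300,  620,  340,  780,  220]

slope, intercept = fit_line(sqft, prices)

# "AI" step: the learned pattern now automates prediction for new homes.
def predict(area):
    return intercept + slope * area

estimate = predict(1800)
```

    The split mirrors the distinction in the text: `fit_line` is the statistical analysis, while reusing its output in `predict` to automate a decision is the AI part.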

    Although this was a very simplified example, it is fundamentally what many of the largest companies in the world do when building models. For example, ChatGPT is a large-scale language model that uses a prompt to predict which words to generate in response to it. Examples like these showcase how AI amplifies the capabilities of data science, making it more efficient and impactful across various domains.