Description:
For this project, we developed an application that displays information from three different sources: the NASA API, a simulator built on Elasticsearch and Kafka, and data scraped from two websites with a web scraper. The application collects and stores data in an Elasticsearch database. The data is transferred through Kafka, which can handle large volumes of messages and connects our data sources to the database. In addition, the system includes a real-time alert component that uses Socket.io to notify users of critical events. The front end is a Single Page Application (SPA) built with ReactJS. The backend is written in NodeJS, following a microservices approach and a Lambda architecture.
Demo video: https://www.youtube.com/watch?v=UYEudRh2xv8&ab_channel=HanotzTv
Tech stack:

- API Service with NodeJS
- Redis
- Elasticsearch
- CloudKarafka
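To make the pipeline described above concrete, here is a minimal sketch of a consumer that reads events from Kafka, indexes them into Elasticsearch, and pushes critical ones to the front end over Socket.io. The broker address, topic and index names, alert port, and the `severity` field are illustrative assumptions, not the project's exact code:

```javascript
// Hypothetical sketch: consume events from Kafka, index them into
// Elasticsearch, and push critical ones to clients over Socket.io.
// Topic name, index name, ports, and the "severity" field are assumptions.
const { Kafka } = require("kafkajs");
const { Client } = require("@elastic/elasticsearch");
const { Server } = require("socket.io");

const kafka = new Kafka({ clientId: "api-service", brokers: ["localhost:9092"] });
const es = new Client({ node: "http://localhost:9200" });
const io = new Server(4000, { cors: { origin: "*" } }); // the SPA connects here

async function run() {
  const consumer = kafka.consumer({ groupId: "events-indexer" });
  await consumer.connect();
  await consumer.subscribe({ topic: "events" });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const event = JSON.parse(message.value.toString());

      // Persist every event in Elasticsearch.
      await es.index({ index: "events", document: event });

      // Forward critical events to connected clients in real time.
      if (event.severity === "critical") {
        io.emit("alert", event);
      }
    },
  });
}

run().catch(console.error);
```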
Clone the project

Important! You need the `.env` files in order to run the project.
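The `.env` files are not committed to the repository. As a purely hypothetical illustration, the variables this stack would typically need look something like this (all names and values below are assumptions, not the project's real keys):

```env
# Hypothetical example — the real variable names are defined by the project's services
ELASTICSEARCH_URL=http://localhost:9200
CLOUDKARAFKA_BROKERS=broker-01.example.cloudkarafka.com:9094
CLOUDKARAFKA_USERNAME=your-username
CLOUDKARAFKA_PASSWORD=your-password
REDIS_URL=redis://localhost:6379
NASA_API_KEY=your-nasa-api-key
```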
```bash
git clone https://github.com/dolev146/BigData-Cloud-Computing-project.git
```

Go to the project directory
```bash
cd BigData-Cloud-Computing-project
```

Set Up Docker
Note: the Dockerfile targets Apple M1 (arm64). If you are not on Apple Silicon, the Dockerfile needs to be modified to target x86 instead of arm64.
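For example, the platform change in a Dockerfile would look roughly like this (the base image is an assumption for illustration):

```dockerfile
# Apple Silicon (arm64), as shipped:
FROM --platform=linux/arm64 node:18

# On Intel/AMD (x86) machines, change it to:
# FROM --platform=linux/amd64 node:18
```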
Make sure the Docker engine is running on your computer, then run the following commands:
```bash
docker-compose build
docker-compose up
```

Set Up Scraper
Navigate to the scraper folder

```bash
cd scraper
```

Install dependencies

```bash
npm i
```

Start the scraper

```bash
npm start
```
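For orientation, a scraper worker in this setup might look roughly like the sketch below: fetch a page, extract items, and publish them to Kafka. The target URL, CSS selector, and topic name are hypothetical:

```javascript
// Hypothetical sketch of a scraper worker: fetch a page, extract items,
// and publish them to Kafka. URL, selector, and topic are illustrative.
const axios = require("axios");
const cheerio = require("cheerio");
const { Kafka } = require("kafkajs");

const kafka = new Kafka({ clientId: "scraper", brokers: ["localhost:9092"] });

async function scrapeOnce() {
  const producer = kafka.producer();
  await producer.connect();

  const { data: html } = await axios.get("https://example.com/news");
  const $ = cheerio.load(html);

  // Collect one record per headline element on the page.
  const items = $(".headline")
    .map((_, el) => ({
      title: $(el).text().trim(),
      scrapedAt: new Date().toISOString(),
    }))
    .get();

  // Publish each scraped item as a Kafka message.
  await producer.send({
    topic: "scraped-events",
    messages: items.map((item) => ({ value: JSON.stringify(item) })),
  });

  await producer.disconnect();
}

scrapeOnce().catch(console.error);
```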
Navigate to the frontend folder

```bash
cd frontend
```

Install dependencies

```bash
npm i
```

Run the front end

```bash
npm run dev
```

Then press "o" in the terminal to open the front end in the browser.
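On the front end, subscribing to the real-time alerts could look roughly like this React component sketch (the server URL and the "alert" event name are assumptions matching the earlier consumer sketch):

```javascript
// Hypothetical sketch: a React component subscribing to real-time alerts.
// The server URL and "alert" event name are assumptions.
import { useEffect, useState } from "react";
import { io } from "socket.io-client";

export default function Alerts() {
  const [alerts, setAlerts] = useState([]);

  useEffect(() => {
    const socket = io("http://localhost:4000");
    // Append each incoming alert to the list.
    socket.on("alert", (event) => setAlerts((prev) => [...prev, event]));
    return () => {
      socket.disconnect(); // clean up on unmount
    };
  }, []);

  return (
    <ul>
      {alerts.map((a, i) => (
        <li key={i}>{a.title ?? JSON.stringify(a)}</li>
      ))}
    </ul>
  );
}
```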
| Image 1 | Image 2 | Image 3 |
|---|---|---|
| ![]() | ![]() | ![]() |
| Image 1 | Image 2 | Image 3 | Image 4 |
|---|---|---|---|
| ![]() | ![]() | ![]() | ![]() |