Changes from all commits
47 commits
17a1b69
Initial commit
AAGAM17 Sep 14, 2024
e8d2c98
Initial commit
AAGAM17 Sep 14, 2024
af343bf
data analyzed
Dark-Knight499 Sep 15, 2024
73106ea
data analyzed
Dark-Knight499 Sep 15, 2024
8e17601
final
Dark-Knight499 Sep 15, 2024
98cbbc0
final
Dark-Knight499 Sep 15, 2024
5c607af
Add files via upload
Dark-Knight499 Sep 17, 2024
cc8904d
Add files via upload
Dark-Knight499 Sep 17, 2024
13c5c08
updated
Oct 5, 2024
250a89d
updated
Oct 5, 2024
b0cb993
folder structure
Oct 5, 2024
900cb3b
folder structure
Oct 5, 2024
470b500
Delete ai directory
AAGAM17 Oct 5, 2024
4fdfa96
Delete ai directory
AAGAM17 Oct 5, 2024
e1def5f
Merge branch 'main' of https://github.com/AAGAM17/sesimotrack
Oct 5, 2024
f193798
Merge branch 'main' of https://github.com/AAGAM17/sesimotrack
Oct 5, 2024
d67929d
Fix
Oct 5, 2024
05ff551
Fix
Oct 5, 2024
ab4bc4b
"Updated backend and frontend code: modified API routes, added CORS c…
Oct 5, 2024
bf89157
"Updated backend and frontend code: modified API routes, added CORS c…
Oct 5, 2024
8d0dfcc
Fix
Oct 5, 2024
0eafa02
Fix
Oct 5, 2024
dca32ad
Fix
Oct 5, 2024
506aa56
Fix
Oct 5, 2024
c5459e7
Routing all pages
Prasham-Karkera Oct 5, 2024
cac3bf7
Routing all pages
Prasham-Karkera Oct 5, 2024
6d8bd86
Fix
Oct 5, 2024
aea503f
Fix
Oct 5, 2024
9bb71cb
Fix
Oct 5, 2024
bf09ee3
Remove service account credentials and add to .gitignore
Oct 5, 2024
ca92ed1
Remove service account credentials and add to .gitignore
Oct 5, 2024
12499f9
Fix
Oct 5, 2024
6361796
Fix
Oct 5, 2024
e50327c
Fix
Oct 5, 2024
a92b90a
Fix
Oct 5, 2024
9d9f01e
Fix
Oct 5, 2024
2b663da
Fix
Oct 5, 2024
6612301
Update README.md
AAGAM17 Oct 6, 2024
7ac2753
Fix
Oct 6, 2024
76f80b2
Merge branch 'main' of https://github.com/AAGAM17/nasa-spaceapp-2024-…
Oct 6, 2024
848035c
Update README.md
AAGAM17 Oct 6, 2024
b628342
Update README.md
AAGAM17 Oct 6, 2024
b224f87
Merge branch 'main' of https://github.com/AAGAM17/nasa-spaceapp-2024-…
Oct 6, 2024
c430222
Add files via upload
AAGAM17 Oct 6, 2024
21ce660
Update README.md
AAGAM17 Oct 6, 2024
9bab3d4
Add files via upload
AAGAM17 Oct 6, 2024
c8a8802
Update README.md
AAGAM17 Oct 6, 2024
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@
backend/seismotrack-aa80c-firebase-adminsdk-cw0zv-13c8fa7db2.json
backend/seismotrack-aa80c-firebase-adminsdk-cw0zv-13c8fa7db2.json
Binary file added Execution Plan PDF.pdf
Binary file not shown.
68 changes: 59 additions & 9 deletions README.md
@@ -1,13 +1,63 @@
# NASA Space Apps Challenge 2024 [Noida]

#### Team Name -
#### Problem Statement -
#### Team Leader Email -
#### Team Name - Cosmic Hactivists
#### Problem Statement - Seismic Detection Across the Solar System
#### Team Leader Email - [email protected]

## A Brief of the Prototype:
What is your solution? and how it works.
# Seismic Signal Analysis System

## Code Execution Instruction:
*[If your solution is **not** application based, you can ignore this para]

*The Repository must contain your **Execution Plan PDF**.
## A Brief of the Prototype

### What is Your Solution?

**Solution Overview**:
The proposed solution is a **Seismic Signal Analysis System** that processes uploaded seismic recordings to detect significant events. It combines classical signal processing (bandpass filtering and STA/LTA event detection) with a machine-learning step (a Random Forest regressor used to smooth the raw velocity signal), so that users can pick out likely seismic events from noisy time-series data.

**How It Works**:
1. **File Upload**: Users upload CSV files containing seismic data (time and velocity measurements). The application accepts only `.csv` files and verifies that the required time and velocity columns are present.

2. **Data Processing**: When a file is uploaded, the backend server (built with Flask) processes the data. The system applies a **bandpass filter** (roughly 0.3 to 1 Hz in the current implementation) to the velocity data to isolate the frequency range most relevant for seismic analysis.

3. **STA/LTA Calculation**: The system computes the **Short-Term Average/Long-Term Average (STA/LTA)** ratio to flag significant events in the seismic data. This technique highlights transient seismic activity against background noise; a minimal sketch of the calculation follows this list.

4. **Visualization**: After processing, the application generates a plot displaying the filtered velocity data over time, highlighting significant seismic events based on the calculated STA/LTA ratio.

5. **Result Display**: The resulting plot is returned to the user as an image, which they can view and download for further analysis.
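
To make the detection step concrete, here is a short, self-contained sketch of the STA/LTA calculation. It is an illustration only, not the project's full pipeline: the synthetic trace, the assumed 1 Hz sampling rate, and the burst placed near t = 3000 s are made up for the example, while the moving-average formulation, the 120/600-sample windows, and the small epsilon guard mirror what `backend/app.py` does.

```python
# Minimal STA/LTA sketch on synthetic data (illustration only; the real
# pipeline in backend/app.py also bandpass-filters the velocity first).
import numpy as np

def compute_sta_lta(signal, sta_window, lta_window):
    # Short- and long-term averages of the absolute signal, computed as
    # centered moving averages via convolution with box windows.
    sta = np.convolve(np.abs(signal), np.ones(sta_window) / sta_window, mode='same')
    lta = np.convolve(np.abs(signal), np.ones(lta_window) / lta_window, mode='same')
    return sta, lta

# Synthetic trace: low-level noise plus one decaying burst ("event") at t = 3000 s.
rng = np.random.default_rng(42)
velocity = 0.01 * rng.standard_normal(6000)          # assumed 1 sample per second
velocity[3000:3060] += 0.2 * np.exp(-np.arange(60) / 20.0)

sta, lta = compute_sta_lta(velocity, sta_window=120, lta_window=600)
ratio = sta / (lta + 1e-6)                            # epsilon prevents division by zero

print(f"Detected event near t = {ratio.argmax()} s (max STA/LTA = {ratio.max():.2f})")
```

In the real system the same ratio is computed on the bandpass-filtered velocity column, and the sample with the largest ratio is marked on the plot returned to the user.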

---

## Code Execution Instructions
### Step-by-Step Execution Plan

1. **Clone the Repository**:

   ```bash
   git clone https://github.com/AAGAM17/nasa-spaceapp-2024-submission.git
   cd nasa-spaceapp-2024-submission
   ```

2. **Set Up the Backend**:

   Navigate to the backend directory:
   ```bash
   cd backend
   ```
   Install the required packages:
   ```bash
   pip install -r requirements.txt
   ```
   Start the Flask server:
   ```bash
   python app.py
   ```

3. **Set Up the Frontend**:

   Navigate to the frontend directory:
   ```bash
   cd ../frontend
   ```
   Install the required packages:
   ```bash
   npm install
   ```
   Start the React development server:
   ```bash
   npm run dev
   ```
5 changes: 5 additions & 0 deletions backend/.gitignore
@@ -0,0 +1,5 @@
seismotrack-aa80c-firebase-adminsdk-cw0zv-13c8fa7db2.json
seismotrack-aa80c-firebase-adminsdk-cw0zv-13c8fa7db2

.env
env
Binary file added backend/__pycache__/file_watcher.cpython-311.pyc
Binary file not shown.
267 changes: 267 additions & 0 deletions backend/app.py
@@ -0,0 +1,267 @@
import os
import numpy as np
import matplotlib
matplotlib.use('Agg')
from flask import Flask, request, jsonify, send_file, send_from_directory
from flask_cors import CORS
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
import matplotlib.pyplot as plt
from scipy.signal import spectrogram, butter, filtfilt
import firebase_admin
from firebase_admin import credentials, storage
from datetime import datetime, timedelta
from io import BytesIO

app = Flask(__name__)
CORS(app, resources={r"/*": {"origins": "*"}})

UPLOAD_FOLDER = 'uploads/'
PROCESSED_FOLDER = 'processed_data/'
ALLOWED_EXTENSIONS = {'csv'}
os.makedirs(UPLOAD_FOLDER, exist_ok=True)
os.makedirs(PROCESSED_FOLDER, exist_ok=True)

cred = credentials.Certificate('./seismotrack-aa80c-firebase-adminsdk-cw0zv-13c8fa7db2.json')
firebase_admin.initialize_app(cred, {
'storageBucket': 'seismotrack-aa80c.appspot.com'
})

bucket = storage.bucket()


def allowed_file(filename):
return '.' in filename and filename.rsplit('.', 1)[1].lower() in ALLOWED_EXTENSIONS

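# /api/upload: accepts a CSV upload, runs it through process_csv() (Random Forest
# "noise filtering" plus spectrogram generation), pushes the filtered CSV to
# Firebase Storage, and returns the filtered rows, the spectrogram path, and the
# Firebase file name as a download link.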
@app.route('/api/upload', methods=['POST'])
def upload_file():
if 'file' not in request.files:
return jsonify({'error': 'No file part'}), 400

file = request.files['file']

if file.filename == '':
return jsonify({'error': 'No selected file'}), 400

if file and allowed_file(file.filename):
try:
file_path = os.path.join(UPLOAD_FOLDER, file.filename)
file.save(file_path)

# Process the file with AI model and filter noise
filtered_data, spectrogram_path = process_csv(file_path)

# Generate the filename for Firebase
time_abs = datetime.now().strftime('%Y-%m-%dT%H:%M:%S.%f') # Current time in the specified format
filtered_file_name = f"filtered_data_{time_abs}.csv"
upload_to_firebase('./processed_data/filtered_data.csv', filtered_file_name)

return jsonify({
"filtered_data": filtered_data,
"spectrogram_path": f'spectrogram/{os.path.basename(spectrogram_path)}',
"download_link": filtered_file_name
}), 200

except Exception as e:
return jsonify({'error': str(e)}), 500 # Return specific error message

else:
return jsonify({'error': 'File type not allowed'}), 400

def upload_to_firebase(file_path, file_name):
try:
blob = bucket.blob(file_name)
blob.upload_from_filename(file_path)
print(f"Uploaded {file_name} to Firebase Storage.")
except Exception as e:
raise ValueError(f"Failed to upload to Firebase: {e}")

def generate_signed_url(blob_name):
blob = bucket.blob(blob_name)
url = blob.generate_signed_url(expiration=timedelta(minutes=15)) # URL valid for 15 minutes
return url

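# /api/logs: lists every file in the Firebase Storage bucket along with its
# creation time and a short-lived (15-minute) signed download URL.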
@app.route('/api/logs', methods=['GET'])
def get_logs():
blobs = bucket.list_blobs() # Fetch all blobs (files) in the storage bucket
log_data = []
for blob in blobs:
signed_url = generate_signed_url(blob.name) # Generate signed URL for each file
log_data.append({
"name": blob.name,
"time_created": blob.time_created,
"download_url": signed_url # Use signed URL instead of public URL
})
return jsonify(log_data), 200

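# process_csv: loads the uploaded CSV, fits a RandomForestRegressor mapping
# relative time to velocity, stores its predictions in a 'filtered' column,
# saves the result to processed_data/filtered_data.csv, and builds a spectrogram.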
def process_csv(file_path):
try:
seismic_data = pd.read_csv(file_path)

if 'rel_time(sec)' not in seismic_data.columns or 'velocity(c/s)' not in seismic_data.columns:
raise ValueError("CSV file is missing required columns: 'rel_time(sec)' and 'velocity(c/s)'")

X = seismic_data[['rel_time(sec)']]
y = seismic_data['velocity(c/s)']

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

predictions = model.predict(X)

seismic_data['filtered'] = predictions

processed_file_path = os.path.join(PROCESSED_FOLDER, 'filtered_data.csv')
seismic_data.to_csv(processed_file_path, index=False)

spectrogram_path = generate_spectrogram(seismic_data)

return seismic_data.to_dict(orient='records'), spectrogram_path

except Exception as e:
raise ValueError(f"Error processing CSV file: {e}")

def generate_spectrogram(data):
# Generate the spectrogram based on your seismic data
fs = 1.0 # Sampling frequency, adjust as needed
f, t, Sxx = spectrogram(data['velocity(c/s)'], fs)

plt.figure()
plt.pcolormesh(t, f, 10 * np.log10(Sxx), shading='gouraud')
plt.ylabel('Frequency [Hz]')
plt.xlabel('Time [sec]')
plt.title('Spectrogram of Seismic Data')
plt.colorbar(label='Intensity [dB]')

# Save the spectrogram image in the processed data folder
spectrogram_path = os.path.join('processed_data', 'seismic_spectrogram.png')
plt.savefig(spectrogram_path)
plt.close() # Close the figure to free memory

return spectrogram_path

@app.route('/spectrogram/<filename>')
def serve_spectrogram(filename):
return send_from_directory('processed_data', filename)

@app.route('/api/download', methods=['GET'])
def download_filtered_data():
filtered_file_path = os.path.join(PROCESSED_FOLDER, 'filtered_data.csv')
return send_file(filtered_file_path, as_attachment=True)

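# butter_bandpass: designs a Butterworth bandpass filter, with the cutoffs
# normalized by the Nyquist frequency as scipy.signal.butter expects.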
def butter_bandpass(lowcut, highcut, fs, order=4):
nyquist = 0.5 * fs
low = lowcut / nyquist
high = highcut / nyquist

if low >= high or low <= 0 or high >= 1:
raise ValueError("Invalid filter frequencies. Ensure 0 < lowcut < highcut < Nyquist frequency.")

b, a = butter(order, [low, high], btype='band')
return b, a

def apply_bandpass_filter(data, lowcut, highcut, fs, order=4):
b, a = butter_bandpass(lowcut, highcut, fs, order=order)
filtered_data = filtfilt(b, a, data)
return filtered_data

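# filter: infers the sampling frequency from the mean spacing of 'time_rel(sec)',
# then bandpasses the velocity to roughly 0.3-1 Hz (clamped below Nyquist).
# Note: this function name shadows Python's built-in filter() within this module.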
def filter(df: pd.DataFrame):
time_diff = df['time_rel(sec)'].diff().mean()
fs = 1 / time_diff
print(f"Calculated Sampling Frequency: {fs} Hz")

lowcut = 0.3
highcut = 1

nyquist = 0.5 * fs
if lowcut >= nyquist or highcut >= nyquist:
print(f"Warning: Adjusting highcut and lowcut to be within Nyquist frequency ({nyquist} Hz)")
highcut = nyquist * 0.9
lowcut = min(lowcut, highcut * 0.1)

df['filtered_velocity'] = apply_bandpass_filter(df['velocity(m/s)'].values, lowcut, highcut, fs)
return df

def give_power(df: pd.DataFrame):
df["fpower"] = df["filtered_velocity"] ** 2
return df

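# compute_sta_lta: short- and long-term averages of the absolute signal,
# computed as centered moving averages via convolution with box windows.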
def compute_sta_lta(signal, sta_window, lta_window):
sta = np.convolve(np.abs(signal), np.ones(sta_window)/sta_window, mode='same')
lta = np.convolve(np.abs(signal), np.ones(lta_window)/lta_window, mode='same')
return sta, lta

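# process_file: full detection pipeline for /api/process-seismic: bandpass
# filtering, a squared-power column, STA/LTA with 120/600-sample windows, and
# the STA/LTA ratio, saved to processed_data/processed_seismic_data.csv.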
def process_file(file_path):
try:
df = pd.read_csv(file_path)
print("Data loaded successfully:", df.head()) # Print the first few rows

df = filter(df)
print("Data filtered successfully:", df[['time_rel(sec)', 'filtered_velocity']].head()) # Print filtered data

df = give_power(df)
print("Power calculated successfully:", df[['filtered_velocity', 'fpower']].head()) # Print power data

sta_window = 120
lta_window = 600
df['STA'], df['LTA'] = compute_sta_lta(df['filtered_velocity'], sta_window, lta_window)

df['STA/LTA_ratio'] = df['STA'] / (df['LTA'] + 1e-6) # Prevent division by zero

df.to_csv(os.path.join(PROCESSED_FOLDER, 'processed_seismic_data.csv'), index=False)
return df
except Exception as e:
print(f"Error in processing file: {e}") # Print error message for debugging
raise e # Re-raise the exception to handle it in the API


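# /api/process-seismic: accepts a CSV upload, runs process_file(), and returns a
# PNG plot of the velocity trace with the maximum-STA/LTA sample marked in red.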
@app.route('/api/process-seismic', methods=['POST'])
def process_seismic_data():
# Check if the request contains the file part
if 'file' not in request.files:
return jsonify({'error': 'No file part in request'}), 400

file = request.files['file']

# Check if the file is actually selected
if file.filename == '':
return jsonify({'error': 'No selected file'}), 400

# Save the uploaded file
file_path = os.path.join(UPLOAD_FOLDER, file.filename)
file.save(file_path)

# Process the file
try:
df = process_file(file_path)
except Exception as e:
print(f"Error processing file: {e}") # Print the error to the console
return jsonify({'error': str(e)}), 500 # Return the error message to the client

# Find the max STA/LTA ratio and generate the plot
id_max = df["STA/LTA_ratio"].idxmax()

# Plot the seismic data and highlight the max STA/LTA event
fig, ax = plt.subplots()
ax.plot(df["time_rel(sec)"], df["velocity(m/s)"], label='Velocity')
ax.axvline(x=df["time_rel(sec)"][id_max], color='red', label='Seismic Event (Max STA/LTA)')
ax.set_xlabel('Time (s)')
ax.set_ylabel('Velocity (m/s)')
ax.legend()

# Save the plot to a BytesIO object and return it as an image
img_io = BytesIO()
plt.savefig(img_io, format='png')
plt.close()
img_io.seek(0)

    return send_file(img_io, mimetype='image/png', download_name='seismic_plot.png')  # download_name replaces the pre-2.0 attachment_filename argument in Flask


if __name__ == '__main__':
app.run(debug=True)