# NLPrescue: Intelligent Disaster Tweet Analysis System

A machine learning project that uses Natural Language Processing (NLP) to classify tweets as describing real disasters or not.

## Overview

NLPrescue is an intelligent system that helps emergency responders and disaster management teams quickly identify genuine disaster-related social media content. Using machine learning techniques, it differentiates between tweets about actual emergencies (e.g., "Forest fire spreading near downtown!") and non-emergency tweets that use similar language (e.g., "This new album is fire!").

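To make the task concrete, here is a minimal, self-contained baseline sketch of the same binary classification problem. It is a hypothetical illustration using scikit-learn (already listed in the requirements), not the project's actual PyTorch model:

```python
# Toy baseline to illustrate the task (hypothetical example,
# not the model implemented in ML/modelling/).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hand-labelled sample: 1 = real disaster, 0 = not a disaster
tweets = [
    "Forest fire spreading near downtown!",
    "Flood warning issued for the river valley",
    "This new album is fire!",
    "My schedule this week is a total disaster lol",
]
labels = [1, 1, 0, 0]

# TF-IDF features + logistic regression as a simple binary classifier
baseline = make_pipeline(TfidfVectorizer(), LogisticRegression())
baseline.fit(tweets, labels)

print(baseline.predict(["Flood waters rising near downtown"]))
```
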
## Project Structure

```
.
├── .github/              # GitHub Actions workflows
├── ML/                   # Core ML implementation
│   ├── data/             # Training and test datasets
│   ├── dataset/          # Data loading and processing
│   ├── helper_functions/ # Utility functions
│   ├── modelling/        # Model implementations
│   └── predictions/      # Model outputs
├── tests/                # Test suite
└── wandb/                # Weights & Biases logging
```

## Features

- Binary classification of tweets (disaster vs. non-disaster)
- PyTorch-based implementation (a hypothetical model sketch follows this list)
- Multiple model architectures
- Weights & Biases integration for experiment tracking
- Comprehensive test coverage
- GPU acceleration support

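As a rough illustration of the kind of PyTorch text classifier this project targets, the sketch below defines a small embedding-bag model with a two-class output head. It is an assumed stand-in, not one of the architectures actually implemented in ML/modelling/:

```python
# Hypothetical PyTorch sketch of a binary tweet classifier
# (illustrative only; the real models live in ML/modelling/).
import torch
from torch import nn

class TweetClassifier(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 64):
        super().__init__()
        # EmbeddingBag averages token embeddings into one vector per tweet
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim)
        self.classifier = nn.Linear(embed_dim, 2)  # disaster vs. non-disaster

    def forward(self, token_ids: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.embedding(token_ids, offsets))

# Dummy forward pass: two "tweets" packed into one flat tensor of token ids
model = TweetClassifier(vocab_size=10_000)
token_ids = torch.tensor([1, 5, 7, 2, 9], dtype=torch.long)
offsets = torch.tensor([0, 3], dtype=torch.long)  # each tweet's start index
logits = model(token_ids, offsets)
print(logits.shape)  # torch.Size([2, 2])
```
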
## Requirements

- Python 3.7+
- PyTorch
- torchvision
- torchtext
- pandas
- numpy
- scikit-learn
- wandb
- matplotlib
- tqdm

## Installation

1. Clone the repository

```bash
git clone https://github.com/Programmer-RD-AI/NLP-Disaster-Tweets.git
```

2. Install dependencies

```bash
pip install -r requirements.txt
```

## Usage

To train the model:

```bash
python run.py
```

Monitor training progress in the Weights & Biases dashboard.

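The snippet below shows the generic Weights & Biases logging pattern (init, log, finish) that produces those dashboard charts. The project name, config keys, and metric names here are assumptions for illustration, not necessarily what run.py uses:

```python
# Generic W&B logging pattern (assumed project/metric names; run.py may differ).
import random
import wandb

# mode="offline" lets you try this without a W&B account; drop it to sync online
run = wandb.init(project="NLP-Disaster-Tweets", mode="offline",
                 config={"lr": 1e-3, "epochs": 3})

for epoch in range(run.config.epochs):
    # Placeholder numbers standing in for real training/validation metrics
    wandb.log({"epoch": epoch,
               "train_loss": random.random(),
               "val_accuracy": random.random()})

wandb.finish()
```
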
## Dataset

The project uses two main datasets, taken from the Kaggle [NLP Getting Started](https://www.kaggle.com/competitions/nlp-getting-started/data?select=train.csv) competition:

- train.csv: Labeled tweets for training
- test.csv: Unlabeled tweets for prediction

Labels:

- 1: Real disaster
- 0: Not a real disaster

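A quick way to sanity-check the files (the paths assume the CSVs sit under ML/data/, and the column names follow the standard Kaggle format with the tweet text in `text` and the label in `target`):

```python
# Quick look at the datasets (paths and column names are assumptions
# based on the project layout and the Kaggle competition format).
import pandas as pd

train = pd.read_csv("ML/data/train.csv")
test = pd.read_csv("ML/data/test.csv")

print(train.shape, test.shape)
print(train["target"].value_counts())  # 1 = real disaster, 0 = not a disaster
print(train[["text", "target"]].head())
```
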
## Contributing

1. Fork the repository
2. Create a feature branch
3. Commit your changes
4. Push to the branch
5. Open a pull request

## License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.