Commit 5a6d55b

Merge pull request #45 from suraj-webkul/datagrid-docs

Update data transfer docs.

2 parents c76c40c + b8f37f2

2 files changed (+114, -40 lines)

docs/2.1/advanced/data-transfer.md

Lines changed: 57 additions & 20 deletions
````diff
@@ -6,26 +6,71 @@ The **Data Transfer Module** allows you to import large amounts of data from CSV
 
 ### Features
 
-- **Queue and Non-Queue Based Import**: Supports importing via Laravel queues for background processing or direct imports for smaller datasets (without queue/sync).
-- **CSV Data Validation**: Validate CSV data before importing to ensure data integrity.
-- **Validation Strategies**: Choose between different strategies to handle data errors (`Stop on Error`, `Skip Errors`).
-- **CSV Delimiter Customization**: Support for different CSV delimiters.
-- **Allowed Errors**: Configure the number of allowable errors before the process fails.
-- **CRUD Actions**: Supports Create, Update, and Delete operations.
+1. **Queue and Non-Queue Based Import**: Supports importing via Laravel queues for background processing or direct imports for smaller datasets (without queue/sync).
+2. **CSV Data Validation**: Validate CSV data before importing to ensure data integrity.
+3. **Validation Strategies**: Choose between different strategies to handle data errors (`Stop on Error`, `Skip Errors`).
+4. **CSV Delimiter Customization**: Support for different CSV delimiters.
+5. **Allowed Errors**: Configure the number of allowable errors before the process fails.
+6. **CRUD Actions**: Supports Create, Update, and Delete operations.
 
 ### Usage
 
-#### Importing Data
+### Importing Data
 
 The module can import data for **Leads**, **Products**, and **Persons** entities from CSV files. You can run the import with or without Laravel's queue feature, depending on your dataset size.
 
-##### Import without Queue
+#### 1. Import without Queue
 
 If you prefer to import data without utilizing a queue system, you can achieve this by turning off the queue processing functionality.
 
-##### Import with Queue
+::: tip Note
+This is not recommended for large datasets.
+:::
 
-If you prefer to import data utilizing a queue system, you can achieve this by turning on the queue processing functionality.
+#### 2. Import with Queue
+
+To import data using a queue system, enable queue processing and ensure the Laravel queue worker is running on your server. For automatic handling, use Supervisor to keep the queue worker running in the background.
+
+You can install Supervisor with this command:
+
+```bash
+sudo apt-get install supervisor
+```
+
+After installing, go to the `/etc/supervisor/conf.d` directory and create a file like `laravel-worker.conf`. Here's an example configuration:
+
+```ini
+[program:laravel-worker]
+process_name=%(program_name)s_%(process_num)02d
+command=php /path-to-your-project/artisan queue:work --sleep=3 --tries=3 --max-time=3600
+autostart=true
+autorestart=true
+stopasgroup=true
+killasgroup=true
+user=your-username
+numprocs=8
+redirect_stderr=true
+stdout_logfile=/path-to-your-project/worker.log
+stopwaitsecs=3600
+```
+
+Make sure to update the `command`, `user`, and paths (`/path-to-your-project/`) according to your server setup.
+
+> **Note**: `stopwaitsecs` should be set longer than your longest running job to prevent jobs from getting killed early.
+
+After creating the config, run these commands to activate it:
+
+```bash
+sudo supervisorctl reread
+sudo supervisorctl update
+sudo supervisorctl start "laravel-worker:*"
+```
+
+Or, if you don't want to set up Supervisor, you can run the queue worker manually from the terminal:
+
+```bash
+php artisan queue:work
+```
 
 ### Validation
 
@@ -59,16 +104,8 @@ The module supports three main actions during the import process:
 
 Before finalizing the import, you can review and edit the data. The system allows you to preview the imported data and make corrections if needed.
 
-### Queue Configuration
-
-If you are using queues for import, make sure your Laravel queue worker is running:
-
-```bash
-php artisan queue:work
-```
-
 You can adjust the queue settings in the `config/queue.php` file if needed.
 
 ### Conclusion
 
-The **Data Transfer Module** provides a robust solution for importing large datasets into your Krayin application, with flexible options for validation, error handling, and queue-based processing. Whether you're importing millions of records or just a few, this module simplifies the process while ensuring data integrity and flexibility.
+The **Data Transfer Module** provides a robust solution for importing large datasets into your Krayin application, with flexible options for validation, error handling, and queue-based processing. Whether you're importing millions of records or just a few, this module simplifies the process while ensuring data integrity and flexibility.
````
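
The updated page points readers to `config/queue.php` for queue settings. For context, below is a trimmed sketch of the stock Laravel `config/queue.php` (framework defaults, not Krayin-specific) showing the options most relevant to queue-based imports:

```php
<?php

// Trimmed sketch of a stock Laravel config/queue.php (framework defaults, not Krayin-specific).
return [
    // Connection used by `php artisan queue:work`; usually set via QUEUE_CONNECTION in .env.
    'default' => env('QUEUE_CONNECTION', 'sync'),

    'connections' => [
        'database' => [
            'driver' => 'database',
            'table' => 'jobs',
            'queue' => 'default',
            // Seconds before an unfinished job is released and retried;
            // keep this above the runtime of your longest import job.
            'retry_after' => 90,
        ],
    ],
];
```

Note that with the default `sync` connection, jobs run inline in the request; switching `QUEUE_CONNECTION` to a driver such as `database` or `redis` is what actually moves imports onto the background worker configured above.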

docs/master/advanced/data-transfer.md

Lines changed: 57 additions & 20 deletions
````diff
@@ -6,26 +6,71 @@ The **Data Transfer Module** allows you to import large amounts of data from CSV
 
 ### Features
 
-- **Queue and Non-Queue Based Import**: Supports importing via Laravel queues for background processing or direct imports for smaller datasets (without queue/sync).
-- **CSV Data Validation**: Validate CSV data before importing to ensure data integrity.
-- **Validation Strategies**: Choose between different strategies to handle data errors (`Stop on Error`, `Skip Errors`).
-- **CSV Delimiter Customization**: Support for different CSV delimiters.
-- **Allowed Errors**: Configure the number of allowable errors before the process fails.
-- **CRUD Actions**: Supports Create, Update, and Delete operations.
+1. **Queue and Non-Queue Based Import**: Supports importing via Laravel queues for background processing or direct imports for smaller datasets (without queue/sync).
+2. **CSV Data Validation**: Validate CSV data before importing to ensure data integrity.
+3. **Validation Strategies**: Choose between different strategies to handle data errors (`Stop on Error`, `Skip Errors`).
+4. **CSV Delimiter Customization**: Support for different CSV delimiters.
+5. **Allowed Errors**: Configure the number of allowable errors before the process fails.
+6. **CRUD Actions**: Supports Create, Update, and Delete operations.
 
 ### Usage
 
-#### Importing Data
+### Importing Data
 
 The module can import data for **Leads**, **Products**, and **Persons** entities from CSV files. You can run the import with or without Laravel's queue feature, depending on your dataset size.
 
-##### Import without Queue
+#### 1. Import without Queue
 
 If you prefer to import data without utilizing a queue system, you can achieve this by turning off the queue processing functionality.
 
-##### Import with Queue
+::: tip Note
+This is not recommended for large datasets.
+:::
 
-If you prefer to import data utilizing a queue system, you can achieve this by turning on the queue processing functionality.
+#### 2. Import with Queue
+
+To import data using a queue system, enable queue processing and ensure the Laravel queue worker is running on your server. For automatic handling, use Supervisor to keep the queue worker running in the background.
+
+You can install Supervisor with this command:
+
+```bash
+sudo apt-get install supervisor
+```
+
+After installing, go to the `/etc/supervisor/conf.d` directory and create a file like `laravel-worker.conf`. Here's an example configuration:
+
+```ini
+[program:laravel-worker]
+process_name=%(program_name)s_%(process_num)02d
+command=php /path-to-your-project/artisan queue:work --sleep=3 --tries=3 --max-time=3600
+autostart=true
+autorestart=true
+stopasgroup=true
+killasgroup=true
+user=your-username
+numprocs=8
+redirect_stderr=true
+stdout_logfile=/path-to-your-project/worker.log
+stopwaitsecs=3600
+```
+
+Make sure to update the `command`, `user`, and paths (`/path-to-your-project/`) according to your server setup.
+
+> **Note**: `stopwaitsecs` should be set longer than your longest running job to prevent jobs from getting killed early.
+
+After creating the config, run these commands to activate it:
+
+```bash
+sudo supervisorctl reread
+sudo supervisorctl update
+sudo supervisorctl start "laravel-worker:*"
+```
+
+Or, if you don't want to set up Supervisor, you can run the queue worker manually from the terminal:
+
+```bash
+php artisan queue:work
+```
 
 ### Validation
 
@@ -59,16 +104,8 @@ The module supports three main actions during the import process:
 
 Before finalizing the import, you can review and edit the data. The system allows you to preview the imported data and make corrections if needed.
 
-### Queue Configuration
-
-If you are using queues for import, make sure your Laravel queue worker is running:
-
-```bash
-php artisan queue:work
-```
-
 You can adjust the queue settings in the `config/queue.php` file if needed.
 
 ### Conclusion
 
-The **Data Transfer Module** provides a robust solution for importing large datasets into your Krayin application, with flexible options for validation, error handling, and queue-based processing. Whether you're importing millions of records or just a few, this module simplifies the process while ensuring data integrity and flexibility.
+The **Data Transfer Module** provides a robust solution for importing large datasets into your Krayin application, with flexible options for validation, error handling, and queue-based processing. Whether you're importing millions of records or just a few, this module simplifies the process while ensuring data integrity and flexibility.
````
