Commit bbff9bb

NEP-18450 add db export, dashboard import

1 parent 65b3820 commit bbff9bb

File tree

15 files changed: +331 -26 lines

CHANGELOG.md

Lines changed: 6 additions & 0 deletions

@@ -1,4 +1,10 @@
 ## Change Log
+
+## 0.2.1 - 2024-09-17
+
+* add Superset::Database::Export class for exporting database configurations
+* add Superset::Dashboard::Import class for importing a dashboard
+
 ## 0.2.0 - 2024-08-19
 
 * Adding RLS filter clause to the 'api/v1/security/guest_token/' API params in guest_token.rb - https://github.com/rdytech/superset-client/pull/31

doc/duplicate_dashboards.md

Lines changed: 2 additions & 5 deletions

@@ -206,9 +206,6 @@ Putting it simply, the current thinking is to delete all the replica dashboards
 
 ### Bringing the Duplicate Dashboard process into Superset core
 
-(WIP) The goal would be to have the DuplicateDashboard process as a part of the core superset codebase.
-
-To that end this Superset Improvement Proposal (SIP) .. is a starting point.
-
-{add SIP request here}
+An ideal direction would be to have the DuplicateDashboard process as a part of the core superset codebase.
 
+A Superset discussion thread has been started in [Duplicating Dashboards into a new database or schema](https://github.com/apache/superset/discussions/29899)

lib/superset/client.rb

Lines changed: 7 additions & 1 deletion

@@ -44,7 +44,13 @@ def connection
       @connection ||= Faraday.new(superset_host) do |f|
         f.authorization :Bearer, access_token
         f.use FaradayMiddleware::ParseJson, content_type: 'application/json'
-        f.request :json
+
+        if self.config.use_json
+          f.request :json
+        else
+          f.request :multipart
+          f.request :url_encoded
+        end
 
         f.adapter :net_http
       end
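The branch above swaps Faraday's JSON request middleware for multipart/url-encoded middleware when `use_json` is false, which the new Dashboard::Import class relies on to post a zip file. A minimal stdlib sketch of the flag-to-middleware selection (the `middleware_for` helper and the OpenStruct stand-in for the gem's config object are hypothetical, for illustration only):

```ruby
require 'ostruct'

# Hypothetical helper mirroring the branch added to Superset::Client#connection:
# JSON-encode request bodies by default, switch to multipart/url-encoded
# middleware when use_json is false (e.g. for zip-file uploads).
def middleware_for(config)
  config.use_json ? [:json] : [:multipart, :url_encoded]
end

json_config   = OpenStruct.new(use_json: true)
upload_config = OpenStruct.new(use_json: false)

middleware_for(json_config)   # => [:json]
middleware_for(upload_config) # => [:multipart, :url_encoded]
```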

lib/superset/dashboard/export.rb

Lines changed: 1 addition & 1 deletion

@@ -95,4 +95,4 @@ def datestamp
       end
     end
   end
-end
\ No newline at end of file
+end

lib/superset/dashboard/get.rb

Lines changed: 1 addition & 0 deletions

@@ -4,6 +4,7 @@ class Get < Superset::Request
 
       attr_reader :id
 
+      # note .. this endpoint also accepts a dashboard's uuid as the identifier
      def initialize(id)
        @id = id
      end

lib/superset/dashboard/import.rb

Lines changed: 63 additions & 0 deletions

@@ -0,0 +1,63 @@
+# Import the provided Dashboard zip file.
+# In the context of this API import process, the assumption is that the database.yaml file details will match
+# an existing database in the target Superset environment.
+
+# Scenario 1: Export from Env1 -- Import to Env1 (the SAME environment)
+# Results in updating/overwriting the dashboard with the contents of the zip file.
+
+# Scenario 2: Export from Env1 -- Import to Env2 (a DIFFERENT environment)
+# Assumption is that the database.yaml will match a database configuration in the target env.
+# The initial import creates a new dashboard from the contents of the zip file.
+# Subsequent imports update/overwrite the previously imported dashboard with the contents of the zip file.
+
+# The overwrite flag determines whether the dashboard is updated or created new.
+
+# Usage
+# Superset::Dashboard::Import.new(source_zip_file: '/tmp/dashboard.zip', overwrite: true).perform
+#
+
+module Superset
+  module Dashboard
+    class Import < Request
+      attr_reader :source_zip_file, :overwrite
+
+      def initialize(source_zip_file:, overwrite: false)
+        @source_zip_file = source_zip_file
+        @overwrite = overwrite
+      end
+
+      def perform
+        validate_params
+        response
+      end
+
+      def response
+        @response ||= client(use_json: false).post(
+          route,
+          payload
+        )
+      end
+
+      private
+
+      def validate_params
+        raise ArgumentError, 'source_zip_file is required' if source_zip_file.nil?
+        raise ArgumentError, 'source_zip_file does not exist' unless File.exist?(source_zip_file)
+        raise ArgumentError, 'overwrite must be a boolean' unless [true, false].include?(overwrite)
+      end
+
+      def payload
+        {
+          formData: Faraday::UploadIO.new(source_zip_file, 'application/zip'),
+          overwrite: overwrite.to_s
+        }
+      end
+
+      def route
+        "dashboard/import/"
+      end
+    end
+  end
+end
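The guard clauses in `validate_params` fail fast before any network call. A standalone sketch of that check (the free-standing `validate_import_params` method here is hypothetical, not the gem's actual class method):

```ruby
require 'tempfile'

# Standalone sketch of the guard clauses in Superset::Dashboard::Import#validate_params:
# the zip path must be present, must exist on disk, and overwrite must be a real boolean.
def validate_import_params(source_zip_file:, overwrite: false)
  raise ArgumentError, 'source_zip_file is required' if source_zip_file.nil?
  raise ArgumentError, 'source_zip_file does not exist' unless File.exist?(source_zip_file)
  raise ArgumentError, 'overwrite must be a boolean' unless [true, false].include?(overwrite)
  true
end

zip = Tempfile.new(['dashboard', '.zip'])
validate_import_params(source_zip_file: zip.path, overwrite: true)  # => true
```

Passing a truthy string such as `'yes'` for `overwrite` raises, which matches the class converting the flag with `overwrite.to_s` only after validation.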

lib/superset/database/export.rb

Lines changed: 115 additions & 0 deletions

@@ -0,0 +1,115 @@
+# Will export the Database zip file to /tmp/superset_database_exports with the zip filename adjusted to include the database_id
+# Example zipfile: database_#{database_id}_export_#{datestamp}.zip
+#
+# The file is then unzipped and all files copied into the destination_path with the database_id as a subfolder.
+# Optionally remove the dataset yaml files from the export.
+#
+# Usage
+# Superset::Database::Export.new(database_id: 1, destination_path: '/tmp/superset_database_exports/').perform
+# Superset::Database::Export.new(database_id: 1, destination_path: '/tmp/superset_database_exports/', remove_dataset_yamls: true).perform
+#
+
+require 'superset/file_utilities'
+
+module Superset
+  module Database
+    class Export < Request
+      include FileUtilities
+
+      TMP_SUPERSET_DATABASE_PATH = '/tmp/superset_database_exports'.freeze
+
+      attr_reader :database_id, :destination_path, :remove_dataset_yamls
+
+      def initialize(database_id:, destination_path:, remove_dataset_yamls: true)
+        @database_id = database_id
+        @destination_path = destination_path.chomp('/')
+        @remove_dataset_yamls = remove_dataset_yamls
+      end
+
+      def perform
+        create_tmp_dir
+        save_exported_zip_file
+        unzip_files
+        copy_export_files_to_destination_path
+
+        Dir.glob("#{destination_path_with_db_id}/databases/*")
+      end
+
+      def response
+        @response ||= client.call(
+          :get,
+          client.url(route),
+          client.param_check(params)
+        )
+      end
+
+      def exported_zip_path
+        @exported_zip_path ||= "#{tmp_uniq_database_path}/database_#{database_id}_export_#{datestamp}.zip"
+      end
+
+      private
+
+      def params
+        { "q": "!(#{database_id})" } # pulled off chrome dev tools doing a GUI export. Swagger interface not helpful with this endpoint.
+      end
+
+      def save_exported_zip_file
+        File.open(exported_zip_path, 'wb') { |fp| fp.write(response.body) }
+      end
+
+      def unzip_files
+        @extracted_files = unzip_file(exported_zip_path, tmp_uniq_database_path)
+        remove_dataset_yaml_files if remove_dataset_yamls
+      end
+
+      def download_folder
+        File.dirname(extracted_files[0])
+      end
+
+      def destination_path_with_db_id
+        @destination_path_with_db_id ||= File.join(destination_path, database_id.to_s)
+      end
+
+      def copy_export_files_to_destination_path
+        FileUtils.mkdir_p(destination_path_with_db_id) unless File.directory?(destination_path_with_db_id)
+
+        Dir.glob("#{download_folder}/*").each do |item|
+          FileUtils.cp_r(item, destination_path_with_db_id)
+        end
+      end
+
+      def remove_dataset_yaml_files
+        datasets_directories = Dir.glob(File.join(tmp_uniq_database_path, '/*/datasets'))
+
+        datasets_directories.each do |directory|
+          FileUtils.rm_rf(directory) if Dir.exist?(directory)
+        end
+      end
+
+      def create_tmp_dir
+        FileUtils.mkdir_p(tmp_uniq_database_path) unless File.directory?(tmp_uniq_database_path)
+      end
+
+      def tmp_uniq_database_path
+        @tmp_uniq_database_path ||= File.join(TMP_SUPERSET_DATABASE_PATH, uuid)
+      end
+
+      def uuid
+        SecureRandom.uuid
+      end
+
+      def extracted_files
+        @extracted_files ||= []
+      end
+
+      def route
+        "database/export/"
+      end
+
+      def datestamp
+        @datestamp ||= Time.now.strftime('%Y%m%d')
+      end
+    end
+  end
+end
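The copy step in `copy_export_files_to_destination_path` is plain stdlib FileUtils work: everything under the unzipped download folder lands in `destination_path/<database_id>/`. A sketch under that assumption (the standalone `copy_export_files` method and the sample `examples.yaml` file are hypothetical, for illustration):

```ruby
require 'fileutils'
require 'tmpdir'

# Hypothetical standalone version of the copy step: copy every entry under
# download_folder into destination_path/<database_id>/ and return that path.
def copy_export_files(download_folder, destination_path, database_id)
  dest = File.join(destination_path, database_id.to_s)
  FileUtils.mkdir_p(dest)
  Dir.glob("#{download_folder}/*").each { |item| FileUtils.cp_r(item, dest) }
  dest
end

# Simulate an unzipped export: a databases/ folder holding one yaml file.
source    = Dir.mktmpdir
dest_root = Dir.mktmpdir
FileUtils.mkdir_p(File.join(source, 'databases'))
File.write(File.join(source, 'databases', 'examples.yaml'), "database_name: examples\n")

dest = copy_export_files(source, dest_root, 1)
Dir.glob("#{dest}/databases/*")  # one entry: <dest_root>/1/databases/examples.yaml
```

This also shows why `perform` can end with `Dir.glob("#{destination_path_with_db_id}/databases/*")`: after the copy, the database yaml files sit under the id-scoped subfolder.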

lib/superset/dataset/get.rb

Lines changed: 10 additions & 11 deletions

@@ -29,17 +29,6 @@ def title
         result['name']
       end
 
-      private
-
-      def route
-        "dataset/#{id}"
-      end
-
-      def display_headers
-        %w[title schema database_name, database_id]
-      end
-
-
       def database_name
         result['database']['database_name']
       end
@@ -51,6 +40,16 @@ def database_id
       def sql
        ['sql']
       end
+
+      private
+
+      def route
+        "dataset/#{id}"
+      end
+
+      def display_headers
+        %w[title schema database_name, database_id]
+      end
     end
   end
 end

lib/superset/file_utilities.rb

Lines changed: 4 additions & 3 deletions

@@ -9,11 +9,12 @@ def unzip_file(zip_file, destination)
        entry_path = File.join(destination, entry.name)
        entries << entry_path
        FileUtils.mkdir_p(File.dirname(entry_path))
-        zip.extract(entry, entry_path)
+
+        zip.extract(entry, entry_path) unless File.exist?(entry_path)
      end
    end
-    puts entries
-    entries # return array of extracted files
+
+    entries
   end
  end
end
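The new `unless File.exist?(entry_path)` guard makes re-running `unzip_file` over the same destination a no-op for already-extracted entries (rubyzip's `extract` raises when the target file already exists). A stdlib-only sketch of that idempotency guard, with `File.write` standing in for `zip.extract` (the `extract_entry` method here is hypothetical, not the gem's API):

```ruby
require 'fileutils'
require 'tmpdir'

# Sketch of the guard added above: create parent directories, then write the
# entry only if its target path does not already exist, so a second pass over
# the same destination leaves existing files untouched.
def extract_entry(destination, name, contents)
  entry_path = File.join(destination, name)
  FileUtils.mkdir_p(File.dirname(entry_path))
  File.write(entry_path, contents) unless File.exist?(entry_path)  # stand-in for zip.extract
  entry_path
end

dir = Dir.mktmpdir
extract_entry(dir, 'databases/examples.yaml', 'first')
extract_entry(dir, 'databases/examples.yaml', 'second')  # skipped: file already exists
File.read(File.join(dir, 'databases/examples.yaml'))     # => "first"
```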

lib/superset/logger.rb

Lines changed: 2 additions & 2 deletions

@@ -1,6 +1,6 @@
 module Superset
   class Logger
-
+
     def info(msg)
       # puts msg # allow logs to console
       logger.info msg
@@ -17,4 +17,4 @@ def logger
     end
   end
 end
-end
\ No newline at end of file
+end
