Commit bc524a5

Merge pull request #36 from rdytech/NEP-18450-database-export-dashboard-import
NEP-18450 add db export, dashboard import
2 parents 65b3820 + 2a4426e commit bc524a5

17 files changed, +339 -29 lines changed

CHANGELOG.md

Lines changed: 6 additions & 0 deletions
@@ -1,4 +1,10 @@
 ## Change Log
+
+## 0.2.1 - 2024-09-17
+
+* add Superset::Database::Export class for exporting database configurations
+* add Superset::Dashboard::Import class for importing a dashboard
+
 ## 0.2.0 - 2024-08-19

 * Adding RLS filter clause to the 'api/v1/security/guest_token/' API params in guest_token.rb - https://github.com/rdytech/superset-client/pull/31
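
Taken together, the two 0.2.1 entries cover both ends of a migration: export a database configuration from one environment, import a dashboard zip into another. A minimal round-trip sketch, assuming the gem is loaded and a client is configured; the database_id and zip path are illustrative only, not values from this PR:

# Sketch only: export a database config, then import a dashboard zip.
# database_id and the zip path below are illustrative assumptions.
Superset::Database::Export.new(
  database_id: 1,
  destination_path: '/tmp/superset_database_exports/'
).perform

Superset::Dashboard::Import.new(
  source_zip_file: '/tmp/dashboard_export.zip'
).perform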

Gemfile.lock

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ GIT
 PATH
   remote: .
   specs:
-    superset (0.2.0)
+    superset (0.2.1)
       dotenv (~> 2.7)
       enumerate_it (~> 1.7.0)
       faraday (~> 1.0)

doc/duplicate_dashboards.md

Lines changed: 2 additions & 5 deletions
@@ -206,9 +206,6 @@ Putting it simply, the current thinking is to delete all the replica dashboards

 ### Bringing the Duplicate Dashboard process into Superset core

-(WIP) The goal would be to have the DuplicateDashboard process as a part of the core superset codebase.
-
-To that end this Superset Improvement Proposal (SIP) .. is a starting point.
-
-{add SIP request here}
+An ideal direction would be to have the DuplicateDashboard process as a part of the core superset codebase.

+A Superset discussion thread has been started in [Duplicating Dashboards into a new database or schema](https://github.com/apache/superset/discussions/29899)

lib/superset/client.rb

Lines changed: 7 additions & 1 deletion
@@ -44,7 +44,13 @@ def connection
       @connection ||= Faraday.new(superset_host) do |f|
         f.authorization :Bearer, access_token
         f.use FaradayMiddleware::ParseJson, content_type: 'application/json'
-        f.request :json
+
+        if self.config.use_json
+          f.request :json
+        else
+          f.request :multipart
+          f.request :url_encoded
+        end

         f.adapter :net_http
       end
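
The new branch selects Faraday middleware from a use_json config flag: JSON-encoded bodies for ordinary API calls, multipart plus url-encoded bodies for zip uploads such as the dashboard import below. A standalone sketch of the same selection logic, using only Faraday 1.x APIs; build_connection is an illustrative helper, not part of this gem:

require 'faraday'
require 'faraday_middleware'

# Mirrors the middleware selection in Superset::Client#connection above.
# use_json: true  -> JSON request bodies (regular API calls)
# use_json: false -> multipart/url-encoded bodies (file uploads)
def build_connection(host, token, use_json: true)
  Faraday.new(host) do |f|
    f.authorization :Bearer, token
    f.use FaradayMiddleware::ParseJson, content_type: 'application/json'
    if use_json
      f.request :json
    else
      f.request :multipart
      f.request :url_encoded
    end
    f.adapter :net_http
  end
end

# A multipart connection can then post an UploadIO payload, e.g.:
# conn = build_connection('https://superset.example.com', token, use_json: false)
# conn.post('api/v1/dashboard/import/',
#           formData: Faraday::UploadIO.new('/tmp/dashboard.zip', 'application/zip'))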

lib/superset/dashboard/export.rb

Lines changed: 2 additions & 2 deletions
@@ -13,7 +13,7 @@ module Dashboard
   class Export < Request
     include FileUtilities

-    TMP_SUPERSET_DASHBOARD_PATH = '/tmp/superset_dashboards'
+    TMP_SUPERSET_DASHBOARD_PATH = '/tmp/superset_dashboard_exports'

     attr_reader :dashboard_id, :destination_path

@@ -95,4 +95,4 @@ def datestamp
       end
     end
   end
-end
\ No newline at end of file
+end

lib/superset/dashboard/get.rb

Lines changed: 1 addition & 0 deletions
@@ -4,6 +4,7 @@ class Get < Superset::Request

       attr_reader :id

+      # note .. this endpoint also accepts a dashboard's uuid as the identifier
       def initialize(id)
         @id = id
       end
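
Given the note above, an integer id and a uuid should resolve the same dashboard. An illustrative sketch, assuming the .new(...).perform request pattern used elsewhere in this gem; both values are made up:

# Both forms target the same GET dashboard/{id_or_uuid} endpoint.
Superset::Dashboard::Get.new(101).perform
Superset::Dashboard::Get.new('3f1a2b3c-4d5e-6f70-8190-a1b2c3d4e5f6').perform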

lib/superset/dashboard/import.rb

Lines changed: 64 additions & 0 deletions
@@ -0,0 +1,64 @@
+# Import the provided Dashboard zip file
+# In the context of this API import process, the assumption is that the database.yaml file details will match
+# an existing database in the Target Superset Environment.
+
+# Scenario 1: Export from Env1 -- Import to Env1 into the SAME Environment
+# Will result in updating/overwriting the dashboard with the contents of the zip file
+
+# Scenario 2: Export from Env1 -- Import to Env2 into a DIFFERENT Environment
+# Assumption is that the database.yaml will match a database configuration in the target env.
+# Initial import will result in creating a new dashboard with the contents of the zip file.
+# Subsequent imports will result in updating/overwriting the previously imported dashboard with the contents of the zip file.
+
+# The overwrite flag determines whether the dashboard is updated or created new.
+# overwrite: false .. will result in an error if a dashboard with the same UUID already exists
+
+# Usage
+# Superset::Dashboard::Import.new(source_zip_file: '/tmp/dashboard.zip').perform
+#
+
+module Superset
+  module Dashboard
+    class Import < Request
+      attr_reader :source_zip_file, :overwrite
+
+      def initialize(source_zip_file:, overwrite: true)
+        @source_zip_file = source_zip_file
+        @overwrite = overwrite
+      end
+
+      def perform
+        validate_params
+
+        response
+      end
+
+      def response
+        @response ||= client(use_json: false).post(
+          route,
+          payload
+        )
+      end
+
+      private
+
+      def validate_params
+        raise ArgumentError, 'source_zip_file is required' if source_zip_file.nil?
+        raise ArgumentError, 'source_zip_file does not exist' unless File.exist?(source_zip_file)
+        raise ArgumentError, 'overwrite must be a boolean' unless [true, false].include?(overwrite)
+      end
+
+      def payload
+        {
+          formData: Faraday::UploadIO.new(source_zip_file, 'application/zip'),
+          overwrite: overwrite.to_s
+        }
+      end
+
+      def route
+        "dashboard/import/"
+      end
+    end
+  end
+end
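
A short usage sketch of the two overwrite modes described in the header comments; the zip path is illustrative:

# Default: update/overwrite the dashboard matching the zip's UUIDs.
Superset::Dashboard::Import.new(source_zip_file: '/tmp/dashboard.zip').perform

# Strict: the API call errors if a dashboard with the same UUID already exists.
Superset::Dashboard::Import.new(
  source_zip_file: '/tmp/dashboard.zip',
  overwrite: false
).perform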

lib/superset/database/export.rb

Lines changed: 119 additions & 0 deletions
@@ -0,0 +1,119 @@
+# Will export the Database zip file to /tmp/superset_database_exports with the zip filename adjusted to include the database_id
+# Example zipfile: database_#{database_id}_export_#{datestamp}.zip
+#
+# The file is then unzipped and all files copied into the destination_path with the database_id as a subfolder
+# Optionally remove the dataset yaml files from the export
+#
+# Usage
+# Superset::Database::Export.new(database_id: 1, destination_path: '/tmp/superset_database_exports/').perform
+#
+# Superset::Database::Export.new(database_id: 1, destination_path: '/tmp/superset_database_exports/', remove_dataset_yamls: true).perform
+#
+
+require 'superset/file_utilities'
+
+module Superset
+  module Database
+    class Export < Request
+      include FileUtilities
+
+      TMP_SUPERSET_DATABASE_PATH = '/tmp/superset_database_exports'.freeze
+
+      attr_reader :database_id, :destination_path, :remove_dataset_yamls
+
+      def initialize(database_id:, destination_path:, remove_dataset_yamls: true)
+        @database_id = database_id
+        @destination_path = destination_path.chomp('/')
+        @remove_dataset_yamls = remove_dataset_yamls
+      end
+
+      def perform
+        create_tmp_dir
+        save_exported_zip_file
+        unzip_files
+        copy_export_files_to_destination_path
+
+        Dir.glob("#{destination_path_with_db_id}/databases/*")
+      end
+
+      def response
+        @response ||= client.call(
+          :get,
+          client.url(route),
+          client.param_check(params)
+        )
+      end
+
+      def exported_zip_path
+        @exported_zip_path ||= "#{tmp_uniq_database_path}/database_#{database_id}_export_#{datestamp}.zip"
+      end
+
+      private
+
+      def params
+        # The Swagger API interface indicates this endpoint should take an array of integers,
+        # however this does not work within the Swagger interface or when testing the API.
+        # Investigating the Superset GUI with Dev Tools shows that the format below is used.
+        { "q": "!(#{database_id})" }
+      end
+
+      def save_exported_zip_file
+        File.open(exported_zip_path, 'wb') { |fp| fp.write(response.body) }
+      end
+
+      def unzip_files
+        @extracted_files = unzip_file(exported_zip_path, tmp_uniq_database_path)
+        remove_dataset_yaml_files if remove_dataset_yamls
+      end
+
+      def download_folder
+        File.dirname(extracted_files[0])
+      end
+
+      def destination_path_with_db_id
+        @destination_path_with_db_id ||= File.join(destination_path, database_id.to_s)
+      end
+
+      def copy_export_files_to_destination_path
+        FileUtils.mkdir_p(destination_path_with_db_id) unless File.directory?(destination_path_with_db_id)
+
+        Dir.glob("#{download_folder}/*").each do |item|
+          FileUtils.cp_r(item, destination_path_with_db_id)
+        end
+      end
+
+      def remove_dataset_yaml_files
+        datasets_directories = Dir.glob(File.join(tmp_uniq_database_path, '/*/datasets'))
+
+        datasets_directories.each do |directory|
+          FileUtils.rm_rf(directory) if Dir.exist?(directory)
+        end
+      end
+
+      def create_tmp_dir
+        FileUtils.mkdir_p(tmp_uniq_database_path) unless File.directory?(tmp_uniq_database_path)
+      end
+
+      def tmp_uniq_database_path
+        @tmp_uniq_database_path ||= File.join(TMP_SUPERSET_DATABASE_PATH, uuid)
+      end
+
+      def uuid
+        SecureRandom.uuid
+      end
+
+      def extracted_files
+        @extracted_files ||= []
+      end
+
+      def route
+        "database/export/"
+      end
+
+      def datestamp
+        @datestamp ||= Time.now.strftime('%Y%m%d')
+      end
+    end
+  end
+end
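
As the perform method shows, the export lands under destination_path/<database_id> and the database yaml paths are returned via Dir.glob. An illustrative sketch; the id, paths, and returned filename are invented:

# Sketch: export database 1, dropping the dataset yamls from the copy.
files = Superset::Database::Export.new(
  database_id: 1,
  destination_path: '/tmp/exports/',
  remove_dataset_yamls: true
).perform
# => e.g. ["/tmp/exports/1/databases/examples.yaml"] (illustrative)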

lib/superset/dataset/get.rb

Lines changed: 10 additions & 11 deletions
@@ -29,17 +29,6 @@ def title
        result['name']
      end

-      private
-
-      def route
-        "dataset/#{id}"
-      end
-
-      def display_headers
-        %w[title schema database_name, database_id]
-      end
-
      def database_name
        result['database']['database_name']
      end
@@ -51,6 +40,16 @@ def database_id
      def sql
        ['sql']
      end
+
+      private
+
+      def route
+        "dataset/#{id}"
+      end
+
+      def display_headers
+        %w[title schema database_name, database_id]
+      end
    end
  end
end

lib/superset/file_utilities.rb

Lines changed: 4 additions & 3 deletions
@@ -9,11 +9,12 @@ def unzip_file(zip_file, destination)
        entry_path = File.join(destination, entry.name)
        entries << entry_path
        FileUtils.mkdir_p(File.dirname(entry_path))
-       zip.extract(entry, entry_path)
+
+       zip.extract(entry, entry_path) unless File.exist?(entry_path)
      end
    end
-   puts entries
-   entries # return array of extracted files
+
+   entries
  end
end
end
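
With the debug puts gone, unzip_file now quietly returns the extracted entry paths. A small usage sketch, assuming the module resolves to Superset::FileUtilities as the require path suggests and that the gem's rubyzip dependency is available; the paths are illustrative:

require 'superset/file_utilities'

include Superset::FileUtilities

# Extract an export zip, then pick out the yaml entries it contained.
entries = unzip_file('/tmp/dashboard_export.zip', '/tmp/extracted')
yaml_files = entries.select { |path| path.end_with?('.yaml') }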
