Merged
Commits
64 commits
d2873ac
Add python scripts to process obsfcstana outputs and to plot statisti…
gmao-qliu Apr 1, 2025
89ea115
fix nc4 datatype error.
gmao-qliu Apr 2, 2025
310f2c3
renamed directory (python_calc_plot_ObsFcstAna -> ObsFcstAna_stats)
gmao-rreichle Apr 7, 2025
0722776
minor update and rename easev2 script
gmao-qliu Apr 7, 2025
b35d19c
move compute_monthly_stats.py to main directory
gmao-qliu Apr 7, 2025
962c114
removed __pycache__/ files
gmao-rreichle Apr 8, 2025
c54ecbb
further reorg file locations
gmao-qliu Apr 8, 2025
c0f9bf8
fix: correct syntax error in compute_monthly_stats.py
amfox37 Apr 8, 2025
d54ab35
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle Apr 10, 2025
737c1eb
update and reorg functions
gmao-qliu Apr 14, 2025
3517295
add multi-experiments options
gmao-qliu Apr 16, 2025
c951034
correct definition of tile_idx
gmao-qliu Apr 17, 2025
6b4a33d
Add functions to write monthly OmF/OmA statistics to NetCDF files
amfox37 May 6, 2025
030259d
Merge branch 'feature/qliu/add_postproc_scripts' of github.com:GEOS-E…
amfox37 May 6, 2025
e72d156
Bugfix for datetime format when reading netCDF file
amfox37 May 7, 2025
e3d2339
bugfix creating/closing subsequent figures
amfox37 May 7, 2025
9fcdb3c
removed CTRL-M (^M) blue carriage return characters (EASEv2_ind2latlo…
gmao-rreichle May 12, 2025
f0e926e
some cleanup of postproc tool to compute ObsFcstAna stats (Main_examp…
gmao-rreichle May 12, 2025
ef52a11
additional cleanup of postproc tool for ObsFcstAna stats
gmao-rreichle May 13, 2025
ea1bfea
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle May 13, 2025
d5af23f
minimal edits (accidentally forgotten to add in previous commit) (wri…
gmao-rreichle May 13, 2025
4f7f84e
add multiple sample scripts and minor update
gmao-qliu May 15, 2025
8dd3504
remove old example script
gmao-qliu May 15, 2025
ad35e67
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle May 19, 2025
6165af8
cleaner separation of user-defined inputs and processing code (Save_m…
gmao-rreichle May 20, 2025
c9ef759
updated CHANGELOG.md
gmao-rreichle May 20, 2025
737e4e8
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle May 23, 2025
8baa1c4
fix minor typo
gmao-qliu May 27, 2025
8e2f204
fix minor missing info.
gmao-qliu May 27, 2025
3428f20
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle May 27, 2025
c2347f0
removed "executable" permissions from py scripts
gmao-rreichle May 27, 2025
9adeacd
removed obsolete functions from ObsFcstAna_stats/helper/write_nc4.py
gmao-rreichle May 27, 2025
83bb7e0
added documentation, fixed indent (GEOSldas_App/util/shared/python/pl…
gmao-rreichle May 27, 2025
84da80d
separate user definition, reorg functions, and add some comments
gmao-qliu May 28, 2025
1419800
remove unused imports
gmao-qliu May 29, 2025
18d67a8
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle May 29, 2025
d529477
changed variable name for clarity; white-space changes (tile2grid.py)
gmao-rreichle May 29, 2025
80289bf
renaming of function and variables for clarification (tile2grid.py)
gmao-rreichle May 29, 2025
454a99c
renamed function name to avoid confusion with GEOS "tile2grid" operat…
gmao-rreichle May 29, 2025
16f5ce2
rearranged order of inputs and edited comments for clariy (user_confi…
gmao-rreichle May 29, 2025
a05bd01
added/edited comments; white-space changes for better alignment (read…
gmao-rreichle May 29, 2025
55faa33
white space changes for better alignment (write_nc4.py)
gmao-rreichle May 29, 2025
39d1aac
cleaned up time variables in compute_monthly_sums() and save_monthly_…
gmao-rreichle May 29, 2025
7984912
more cleanup of time variables; edits of comments; white-space change…
gmao-rreichle May 29, 2025
7ae212a
edited comments (Save_monthlysums.py)
gmao-rreichle May 29, 2025
13da375
remove 'obs_from' and add 'use_obs'
gmao-qliu May 30, 2025
f02efb4
add print to verfiy that obs species match across exp.
gmao-qliu May 30, 2025
e1bb6c9
additional edits to 'use_obs' / 'obs_from' logic and comments (postpr…
gmao-rreichle Jun 2, 2025
d1852a8
added check to verify that obs species match across experiments (user…
gmao-rreichle Jun 2, 2025
fecb7b7
changed name of "ObsFcstAna_sums" files; edited comments (postproc_Ob…
gmao-rreichle Jun 2, 2025
5aefcfd
cleaned up documentation and changed file name of sample scripts (Get…
gmao-rreichle Jun 2, 2025
fe2b15a
cleaned up documentation of sample scripts (Get_ObsFcstAna_stats.py, …
gmao-rreichle Jun 2, 2025
ff5ff47
fix variable name error
gmao-qliu Jun 3, 2025
a28d129
add comments regard OmF_norm
gmao-qliu Jun 3, 2025
227a110
remove unavaialble variable from filename
gmao-qliu Jun 3, 2025
84c3b58
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle Jun 6, 2025
4b65032
clarified comments about stats of normalized OmFs (Plot_stats_maps.py)
gmao-rreichle Jun 8, 2025
3acc2a9
additional tweak to comments on normalized OmF stats (Plots_stats_map…
gmao-rreichle Jun 8, 2025
f2427fb
cleaned up obsolete if block (code in if and else block was identical…
gmao-rreichle Jun 8, 2025
5d42392
additional cleanup of stats_file name (Get_ObsFcstAna_stats.py, Plot_…
gmao-rreichle Jun 8, 2025
42591bc
shorten 'sums' filename and add exp. config verification
gmao-qliu Jun 10, 2025
57a2303
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle Jun 10, 2025
615ca1a
Merge branch 'develop' into feature/qliu/add_postproc_scripts
gmao-rreichle Jun 11, 2025
990bce2
cleaned up exptag[_list] and outid; added/clarified comments in ObsFc…
gmao-rreichle Jun 11, 2025
@@ -0,0 +1,90 @@
import numpy as np
import sys
sys.path.append('../../shared/python/')

from datetime import datetime, timedelta
from dateutil.relativedelta import relativedelta
from read_GEOSldas import read_ObsFcstAna, read_tilecoord, read_obs_param

def compute_monthly_stats(expdir, expid, domain, this_month, tc, obs_param, var_list):

    n_tile = tc['N_tile']
    n_spec = len(obs_param)

    start_time = this_month.replace(day=1, hour=3)
    end_time   = start_time + relativedelta(months=1)

    data_sum  = {}
    data2_sum = {}

    N_data  = np.zeros((n_tile, n_spec))
    oxf_sum = np.zeros((n_tile, n_spec))
    oxa_sum = np.zeros((n_tile, n_spec))
    fxa_sum = np.zeros((n_tile, n_spec))

    for var in var_list:
        data_sum[var]  = np.zeros((n_tile, n_spec))
        data2_sum[var] = np.zeros((n_tile, n_spec))

    date_time = start_time
    while date_time < end_time:

        fname = expdir + expid + '/output/' + domain + '/ana/ens_avg/Y' + \
                date_time.strftime('%Y') + '/M' + \
                date_time.strftime('%m') + '/' + \
                expid + '.ens_avg.ldas_ObsFcstAna.' + \
                date_time.strftime('%Y%m%d_%H%M') + 'z.bin'

        OFA = read_ObsFcstAna(fname)

        if len(OFA['obs_tilenum']) > 0:
            # Initialize full-size (tile x species) arrays to hold this time step's values
            data_tile = {}
            for var in var_list:
                data_tile[var] = np.zeros((n_tile, n_spec)) + np.nan

            for ispec in np.arange(n_spec):
                # check species' overall "assim" flag for masking
                this_species = int(obs_param[ispec]['species'])
                masked_data = {}
                if obs_param[ispec]['assim'] == 'T':
                    masked_tilenum = OFA['obs_tilenum'][np.logical_and(OFA['obs_species'] == this_species, OFA['obs_assim'] == 1)]
                    for var in var_list:
                        masked_data[var] = OFA[var][np.logical_and(OFA['obs_species'] == this_species, OFA['obs_assim'] == 1)]
                else:
                    masked_tilenum = OFA['obs_tilenum'][OFA['obs_species'] == this_species]
                    for var in var_list:
                        masked_data[var] = OFA[var][OFA['obs_species'] == this_species]

                tile_idx = np.where(np.isin(tc['tile_id'], masked_tilenum))[0]

                for var in var_list:
                    data_tile[var][tile_idx, ispec] = masked_data[var]

            is_valid = ~np.isnan(data_tile['obs_obs'])
            N_data[is_valid]  += 1
            oxf_sum[is_valid] += data_tile['obs_obs'][is_valid]  * data_tile['obs_fcst'][is_valid]
            oxa_sum[is_valid] += data_tile['obs_obs'][is_valid]  * data_tile['obs_ana'][is_valid]
            fxa_sum[is_valid] += data_tile['obs_fcst'][is_valid] * data_tile['obs_ana'][is_valid]
            for var in var_list:
                data_sum[var][is_valid]  += data_tile[var][is_valid]
                data2_sum[var][is_valid] += data_tile[var][is_valid]**2

        date_time = date_time + timedelta(seconds=10800)   # step to the next 3-hourly ObsFcstAna file

    return N_data, data_sum, data2_sum, oxf_sum, oxa_sum, fxa_sum

if __name__ == '__main__':
    date_time = datetime(2015, 5, 1)
    expdir = '/gpfsm/dnb05/projects/p51/SMAP_Nature/'
    expid  = 'SPL4SM_Vv8010'
    domain = 'SMAP_EASEv2_M09_GLOBAL'
    var_list = ['obs_obs', 'obs_obsvar', 'obs_fcst', 'obs_fcstvar', 'obs_ana', 'obs_anavar']

    ftc = expdir + expid + '/output/' + domain + '/rc_out/' + expid + '.ldas_tilecoord.bin'
    tc  = read_tilecoord(ftc)

    fop = expdir + expid + '/output/' + domain + '/rc_out/Y2015/M04/' + expid + '.ldas_obsparam.20150401_0000z.txt'
    obs_param = read_obs_param(fop)

    N_data, data_sum, data2_sum, oxf_sum, oxa_sum, fxa_sum = \
        compute_monthly_stats(expdir, expid, domain, date_time, tc, obs_param, var_list)
@@ -0,0 +1,36 @@
from netCDF4 import Dataset
import numpy as np

def write_sums_nc4(file_path, N_data, data_sum, data2_sum, oxf_sum, oxa_sum, fxa_sum, obs_param):

    nc = Dataset(file_path, 'w', format='NETCDF4')

    tile    = nc.createDimension('tile',    N_data.shape[0])
    species = nc.createDimension('species', N_data.shape[1])

    data = nc.createVariable('obs_param_assim', 'c', ('species',), zlib=True, complevel=4)
    for i in range(len(obs_param)):
        data[i] = obs_param[i]['assim']

    data = nc.createVariable('N_data', 'i4', ('tile', 'species',), zlib=True, complevel=4)
    data[:,:] = N_data

    data = nc.createVariable('obsxfcst_sum', 'f4', ('tile', 'species',), zlib=True, complevel=4)
    data[:,:] = oxf_sum

    data = nc.createVariable('obsxana_sum', 'f4', ('tile', 'species',), zlib=True, complevel=4)
    data[:,:] = oxa_sum

    data = nc.createVariable('fcstxana_sum', 'f4', ('tile', 'species',), zlib=True, complevel=4)
    data[:,:] = fxa_sum

    for key, value in data_sum.items():
        varname = key + '_sum'
        data = nc.createVariable(varname, 'f4', ('tile', 'species',), zlib=True, complevel=4)
        data[:,:] = value

    for key, value in data2_sum.items():
        varname = key + '2_sum'
        data = nc.createVariable(varname, 'f4', ('tile', 'species',), zlib=True, complevel=4)
        data[:,:] = value

    nc.close()
268 changes: 268 additions & 0 deletions GEOSldas_App/util/postproc/ObsFcstAna_stats/main_ObsFcstAna.py
@@ -0,0 +1,268 @@
#!/usr/bin/env python3

"""
GEOSldas DA Diagnostics Map Generator

This script computes and plots GEOSldas Data Assimilation (DA) diagnostics maps
based on the ObsFcstAna output. It processes monthly data and generates final
metrics from these monthly statistics.

The script performs the following main tasks:
1. Loads and processes monthly ObsFcstAna output data
2. Computes various DA diagnostic metrics
3. Generates maps and plots of the computed metrics
4. Saves temporary monthly statistics and final aggregated metrics

Usage:
On NASA's Discover:
$ module load python/GEOSpyD
$ ./main_ObsFcstAna.py
or to run in the background,
$ nohup ./main_ObsFcstAna.py > out.log &

Requirements:
- Python 3.x
- Modules: numpy, matplotlib, netCDF4 (included in GEOSpyD)

Author: Q. Liu
Last Modified: Apr 1, 2025
"""

import numpy as np
import os
from datetime import datetime, timedelta
from dateutil.relativedelta import relativedelta
from netCDF4 import Dataset
import matplotlib.pyplot as plt
from mpl_toolkits.basemap import Basemap
import sys
sys.path.append('../../shared/python/')

from read_GEOSldas import read_tilecoord, read_obs_param
from util import make_folder, array2grid
from plot import plotMap
from easev2 import easev2_ind2latlon
from compute_monthly_stats import compute_monthly_stats
from helper.write_sum_nc4 import write_sums_nc4

import warnings; warnings.filterwarnings("ignore")
import io

#sys.stdout = io.TextIOWrapper(open(sys.stdout.fileno(), 'wb', 0), write_through=True)
#sys.stderr = io.TextIOWrapper(open(sys.stderr.fileno(), 'wb', 0), write_through=True)

expdir = '/gpfsm/dnb05/projects/p51/SMAP_Nature/'
expid = 'SPL4SM_Vv8010'
domain = 'SMAP_EASEv2_M09_GLOBAL'

start_time = datetime(2015,4,1)
end_time = datetime(2021,4,1)

# Define a minimum threshold for the temporal data points to ensure statistical reliability
# of the computed metrics.
Nmin = 20
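# (Nmin is applied below when masking the long-term OmF/OmA statistics;
#  the observation-count map itself is not masked.)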

# Base directory for storing monthly files
# This can be the same as the experiment directory (expdir) or a different location
out_path_mo = '/discover/nobackup/qliu/SMAP_diag/' +expid+'/output/'+domain+'/ana/ens_avg/'

# Directory for diagnostic plots
out_path = '/discover/nobackup/qliu/SMAP_diag/'
make_folder(out_path)

# Variable list for computing sums and sums of squares
var_list = ['obs_obs', 'obs_obsvar','obs_fcst','obs_fcstvar','obs_ana','obs_anavar']

# Read tilecoord and obsparam for tile and obs species information
ftc = expdir+expid+'/output/'+domain+'/rc_out/'+expid+'.ldas_tilecoord.bin'
tc = read_tilecoord(ftc)
n_tile = tc['N_tile']

fop = expdir+expid+'/output/'+domain+'/rc_out/Y2015/M04/'+expid+'.ldas_obsparam.20150401_0000z.txt'
obs_param = read_obs_param(fop)
n_spec = len(obs_param)

# Initialize statistical metrics
data_sum = {}
data2_sum = {}
N_data = np.zeros((n_tile, n_spec))
oxf_sum = np.zeros((n_tile, n_spec))
oxa_sum = np.zeros((n_tile, n_spec))
fxa_sum = np.zeros((n_tile, n_spec))

for var in var_list:
    data_sum[var]  = np.zeros((n_tile, n_spec))
    data2_sum[var] = np.zeros((n_tile, n_spec))

# Time loop: processing data at monthly time steps
date_time = start_time
while date_time < end_time:
    # File to store monthly statistics
    fout_path = out_path_mo + '/Y' + date_time.strftime('%Y') + '/M' + date_time.strftime('%m') + '/'
    make_folder(fout_path)

    fout = fout_path + expid + '.ens_avg.ldas_ObsFcstAna.' + date_time.strftime('%Y%m') + '_stats.nc4'

    # Read monthly data if the file exists, otherwise compute monthly statistics first
    if os.path.isfile(fout):
        print('read sums from monthly file: ' + fout)
        mdata_sum  = {}
        mdata2_sum = {}
        with Dataset(fout, 'r') as nc:
            mN_data  = nc.variables['N_data'][:]
            moxf_sum = nc.variables['obsxfcst_sum'][:]
            moxa_sum = nc.variables['obsxana_sum'][:]
            mfxa_sum = nc.variables['fcstxana_sum'][:]
            for var in var_list:
                mdata_sum[var]  = nc.variables[var + '_sum'][:]
                mdata2_sum[var] = nc.variables[var + '2_sum'][:]
    else:
        print('compute monthly sums for ' + date_time.strftime('%Y%m'))
        mN_data, mdata_sum, mdata2_sum, moxf_sum, moxa_sum, mfxa_sum = \
            compute_monthly_stats(expdir, expid, domain, date_time, tc, obs_param, var_list)
        print('save to monthly file: ' + fout)
        write_sums_nc4(fout, mN_data, mdata_sum, mdata2_sum, moxf_sum, moxa_sum, mfxa_sum, obs_param)

    # Aggregate monthly data
    N_data  += mN_data
    oxf_sum += moxf_sum
    oxa_sum += moxa_sum
    fxa_sum += mfxa_sum

    for var in var_list:
        data_sum[var]  += mdata_sum[var]
        data2_sum[var] += mdata2_sum[var]

    date_time = date_time + relativedelta(months=1)

# Compute the final statistics
# This section calculates the final statistical metrics from the accumulated data.
data_mean  = {}
data2_mean = {}
data_var   = {}

# First, compute the metrics of the individual variables
for var in var_list:
    data_sum[var][N_data == 0]  = np.nan
    data2_sum[var][N_data == 0] = np.nan

    data_mean[var]  = data_sum[var]  / N_data
    data2_mean[var] = data2_sum[var] / N_data
    # var(x) = E[x^2] - (E[x])^2
    data_var[var] = data2_mean[var] - data_mean[var]**2

oxf_sum[N_data == 0] = np.nan
oxa_sum[N_data == 0] = np.nan
fxa_sum[N_data == 0] = np.nan
# E[xy]
oxf_mean = oxf_sum / N_data
oxa_mean = oxa_sum / N_data
fxa_mean = fxa_sum / N_data

# Then compute metrics of O-F, O-A, etc. from the means computed above
# mean(x-y) = E[x] - E[y]
OmF_mean = data_mean['obs_obs'] - data_mean['obs_fcst']
OmA_mean = data_mean['obs_obs'] - data_mean['obs_ana']
# var(x-y) = var(x) + var(y) - 2*cov(x,y)
# cov(x,y) = E[xy] - E[x]*E[y]
OmF_stdv = np.sqrt(data_var['obs_obs'] + data_var['obs_fcst'] - \
                   2 * (oxf_mean - data_mean['obs_obs'] * data_mean['obs_fcst']))

OmA_stdv = np.sqrt(data_var['obs_obs'] + data_var['obs_ana'] - \
                   2 * (oxa_mean - data_mean['obs_obs'] * data_mean['obs_ana']))

OmF_norm_mean = OmF_mean / np.sqrt(data_mean['obs_obsvar'] + data_mean['obs_fcstvar'])
OmF_norm_stdv = np.sqrt(OmF_stdv**2 / (data_mean['obs_obsvar'] + data_mean['obs_fcstvar']))
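# Recap of the algebra used above (per tile and species, with moments accumulated
# over the full period from the monthly sum files):
#   E[x]      = data_mean[...]              (sum / N)
#   var(x)    = E[x^2] - (E[x])^2
#   cov(x,y)  = E[xy]  - E[x]*E[y]          (E[xy] from the obs*fcst and obs*ana sums)
#   stdv(O-F) = sqrt( var(O) + var(F) - 2*cov(O,F) )
# The normalized OmF statistics divide by sqrt( E[obs_obsvar] + E[obs_fcstvar] ),
# i.e., by the expected OmF spread implied by the obs and forecast error variances
# in the ObsFcstAna output.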

# Mask out data points with insufficient observations using the Nmin threshold
# Do NOT apply to N_data
OmF_mean[N_data < Nmin] = np.nan
OmF_stdv[N_data < Nmin] = np.nan
OmF_norm_mean[N_data < Nmin] = np.nan
OmF_norm_stdv[N_data < Nmin] = np.nan
OmA_mean[N_data < Nmin] = np.nan
OmA_stdv[N_data < Nmin] = np.nan

# Combine metrics of individual species using weighted averaging
OmF_mean = np.nansum(OmF_mean*N_data, axis=1)/np.nansum(N_data,axis=1)
OmF_stdv = np.nansum(OmF_stdv*N_data,axis=1)/np.nansum(N_data,axis=1)
OmF_norm_mean = np.nansum(OmF_norm_mean*N_data, axis=1)/np.nansum(N_data,axis=1)
OmF_norm_stdv = np.nansum(OmF_norm_stdv*N_data,axis=1)/np.nansum(N_data,axis=1)
OmA_mean = np.nansum(OmA_mean*N_data, axis=1)/np.nansum(N_data,axis=1)
OmA_stdv = np.nansum(OmA_stdv*N_data,axis=1)/np.nansum(N_data,axis=1)
Nobs_data = np.nansum(N_data, axis=1)
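# Note: the species dimension is collapsed with weights proportional to the number
# of observations per species (N_data); Nobs_data is the total observation count
# across all species.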

# Plotting
fig, axes = plt.subplots(2,2, figsize=(18,10))
plt.rcParams.update({'font.size':14})

for i in np.arange(2):
    for j in np.arange(2):
        units = '[K]'
        if i == 0 and j == 0:
            tile_data = Nobs_data
            # crange is [cmin, cmax]
            crange = [0, np.ceil((end_time - start_time).days/150)*300]
            colormap = plt.get_cmap('jet', 20)
            title_txt = expid + ' Tb Nobs ' + start_time.strftime('%Y%m') + '_' + end_time.strftime('%Y%m')
            units = '[-]'
        if i == 0 and j == 1:
            tile_data = OmF_mean
            crange = [-3, 3]
            colormap = plt.get_cmap('bwr', 15)
            title_txt = expid + ' Tb O-F mean ' + start_time.strftime('%Y%m') + '_' + end_time.strftime('%Y%m')
        if i == 1 and j == 0:
            tile_data = OmF_stdv
            crange = [0, 15]
            colormap = plt.get_cmap('jet', 15)
            title_txt = expid + ' Tb O-F stdv ' + start_time.strftime('%Y%m') + '_' + end_time.strftime('%Y%m')
        if i == 1 and j == 1:
            tile_data = OmF_norm_stdv
            crange = [0, 15]
            colormap = plt.get_cmap('jet', 15)
            title_txt = expid + ' Tb normalized O-F stdv ' + start_time.strftime('%Y%m%d') + '_' + end_time.strftime('%Y%m%d')

        colormap.set_bad(color='0.9')   # light grey; 0=black, 1=white

        # Regrid 1-d tile_data to 2-d grid_data for map plots
        if '_M09_' in domain:   # special case: aggregate M09 tiles to the M36 grid
            grid_data_M09 = np.zeros((1624, 3856)) + np.nan
            grid_data_M09[tc['j_indg'], tc['i_indg']] = tile_data

            # Reshape the data into 4x4 blocks
            reshaped = grid_data_M09.reshape(1624//4, 4, 3856//4, 4)

            # Combine each 4x4 block of M09 cells into one M36 cell
            # (observation counts are summed; all other metrics are averaged)
            if i == 0 and j == 0:
                grid_data = np.sum(reshaped, axis=(1, 3))
            else:
                grid_data = np.nanmean(reshaped, axis=(1, 3))

            lat_M36, lon_M36 = easev2_ind2latlon(np.arange(406), np.arange(964), 'M36')
            lon_2d, lat_2d = np.meshgrid(lon_M36, lat_M36)
        else:
            grid_data, uy, ux = array2grid(tile_data, lat=tc['com_lat'], lon=tc['com_lon'])
            lon_2d, lat_2d = np.meshgrid(ux, uy)

        if 'normalized' in title_txt:
            title_txt = title_txt + '\n' + "avg=%.3f, avg(abs(nstdv-1))=%.3f" % (np.nanmean(grid_data), np.nanmean(np.abs(grid_data - 1.))) + ' ' + units
        elif 'mean' in title_txt:
            title_txt = title_txt + '\n' + "avg=%.3f, avg(abs)=%.3f" % (np.nanmean(grid_data), np.nanmean(np.abs(grid_data))) + ' ' + units
        else:
            title_txt = title_txt + '\n' + "avg=%.2f" % (np.nanmean(grid_data)) + ' ' + units

        if 'normalized' in title_txt:
            grid_data = np.log10(grid_data)
            crange = [-0.6, 0.45]

        mm, cs = plotMap(grid_data, ax=axes[i, j], lat=lat_2d, lon=lon_2d, cRange=crange, \
                         title=title_txt, cmap=colormap, bounding=[-60, 80, -180, 180])

plt.tight_layout()

# Save figure to file
fig.savefig(out_path + 'Map_OmF_' + expid + '_' + start_time.strftime('%Y%m') + '_' + \
            end_time.strftime('%Y%m') + '.png')
#plt.show()
plt.close(fig)
