How to have the xlim with seaborn automatically adjust based on dataframe date range - loops

I am trying to loop through plots. Each "station" is a pandas dataframe containing a single water year of data (Oct 1 to Sept 29). The data is being read in with this code:
sh_784_2020 = pd.read_csv("sh_784_WY2020.csv", parse_dates=['Date'])
sh_784_2020.columns = ["Index", "Date", "Temp_C","Precip_mm","SnowDepth_cm","SWE_mm","SM2","SM8","SM20"]
My plots loop through, but the x-axis always runs from the year 2000 through the current date, even though my data is from 2006-2020. Is there a way to have the xlim adjust automatically to the date range in the dataframe? Or is there a way to create this plot in matplotlib instead of seaborn?
for station in stations:
    station['Density'] = station['SWE_mm'] / (station['SnowDepth_cm'] * 10) * 100
    station['Density range'] = pd.cut(station['Density'], [-np.inf, 25, 30, 35, 40, np.inf])
    Date = station.loc[:, 'Date'].values
    SWE_mm = station.loc[:, 'SWE_mm'].values
    Density = station.loc[:, 'Density'].values
    sns.scatterplot(station['Date'], station['SWE_mm'], hue='Density range', data=station,
                    edgecolor='none', palette=['grey', 'green', 'gold', 'orange', 'crimson'], alpha=1)
    plt.xlim()
    plt.show()
(Plot example 1 and plot example 2 were attached as images.)

If you upgrade to seaborn 0.11 you should find that the default autoscaling works better, but you can get a good result without upgrading by creating the Axes object before plotting and setting the units, e.g. something like
ax = plt.figure().subplots()
ax.xaxis.update_units(station["Date"])
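For example, folded into the loop from the question, it could look something like this (a sketch that assumes the same stations list and column names as above):
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

for station in stations:
    station['Density'] = station['SWE_mm'] / (station['SnowDepth_cm'] * 10) * 100
    station['Density range'] = pd.cut(station['Density'], [-np.inf, 25, 30, 35, 40, np.inf])
    # Create the Axes first and register this station's dates with the x-axis,
    # so the date autoscaling is based on this dataframe's range (e.g. 2006-2020).
    ax = plt.figure().subplots()
    ax.xaxis.update_units(station['Date'])
    sns.scatterplot(x='Date', y='SWE_mm', hue='Density range', data=station,
                    edgecolor='none',
                    palette=['grey', 'green', 'gold', 'orange', 'crimson'],
                    alpha=1, ax=ax)
    plt.show()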

Related

Plotting many pie charts using a loop to create a single figure using matplotlib

I'm having trouble converting a script I wrote that creates and saves 15 pie charts separately into one that saves a single figure with 15 subplots instead. I have tried moving fig, ax = plt.subplots(5, 3, figsize=(7, 7)) out of the loop and specifying the number of rows and columns for the plot, but I get this error: AttributeError: 'numpy.ndarray' object has no attribute 'pie'. This error doesn't occur if I leave that bit of code in the script as seen below. Any help with tweaking the code below to create a single figure with 15 subplots (one for each site) would be enormously appreciated.
import pandas as pd
import matplotlib.pyplot as plt
df = pd.read_excel(path)
df_1 = df.groupby(['Site', 'group'])['Abundance'].sum().reset_index(name='site_count')
site = ['Ireland', 'England', 'France', 'Scotland', 'Italy', 'Spain',
        'Croatia', 'Sweden', 'Denmark', 'Germany', 'Belgium', 'Austria', 'Poland', 'Stearman', 'Hungary']
for i in site:
    df_1b = df_1.loc[df_1['Site'] == i]
    colors = {'Dog': 'orange', 'Cat': 'cyan', 'Pig': 'darkred', 'Horse': 'lightcoral', 'Bird': 'grey',
              'Rat': 'lightsteelblue', 'Whale': 'teal', 'Fish': 'plum', 'Shark': 'darkgreen'}
    wp = {'linewidth': 1, 'edgecolor': "black"}
    fig, ax = plt.subplots(figsize=(7, 7))
    texts, autotexts = ax.pie(df_1b['site_count'],
                              labels=None,
                              shadow=False,
                              colors=[colors[key] for key in labels],
                              startangle=90,
                              wedgeprops=wp,
                              textprops=dict(color="black"))
    plt.setp(autotexts, size=16)
    ax.set_title(site, size=16, weight="bold", y=0)
    plt.savefig('%s_group_diversity.png' % i, bbox_inches='tight', pad_inches=0.05, dpi=600)
It's hard to guess exactly how you'd like the plot to look.
The main changes the code below makes, are:
adding fig, axs = plt.subplots(nrows=5, ncols=3, figsize=(12, 18)). Here axs is a 2d array of subplots. figsize should be large enough to fit the 15 subplots.
df_1b['group'] is used for the labels that decide the color (it's not clear where the labels themselves should be shown, maybe in a common legend; see the sketch after the code below)
autopct='%.1f%%' is added to ax.pie(...). This shows the percentages with one decimal.
With autopct, ax.pie(...) now returns 3 lists: wedges, texts, autotexts. The wedges are the pie-slice patches, the texts are the text objects for the labels (currently empty), and the autotexts are the percentage labels (calculated "automatically").
ax.set_title now uses the site name, and puts it at a negative y-value (y=0 would overlap with the pie)
plt.tight_layout() at the end tries to optimize the surrounding white space
import matplotlib.pyplot as plt
import pandas as pd
import numpy as np
site = ['Ireland', 'England', 'France', 'Scotland', 'Italy', 'Spain',
        'Croatia', 'Sweden', 'Denmark', 'Germany', 'Belgium', 'Austria', 'Poland', 'Stearman', 'Hungary']
colors = {'Dog': 'orange', 'Cat': 'cyan', 'Pig': 'darkred', 'Horse': 'lightcoral', 'Bird': 'grey',
          'Rat': 'lightsteelblue', 'Whale': 'teal', 'Fish': 'plum', 'Shark': 'darkgreen'}
wedge_properties = {'linewidth': 1, 'edgecolor': "black"}
# create some dummy test data
df = pd.DataFrame({'Site': np.random.choice(site, 1000),
                   'group': np.random.choice(list(colors.keys()), 1000),
                   'Abundance': np.random.randint(1, 11, 1000)})
df_1 = df.groupby(['Site', 'group'])['Abundance'].sum().reset_index(name='site_count')
fig, axs = plt.subplots(nrows=5, ncols=3, figsize=(12, 18))
for site_i, ax in zip(site, axs.flat):
    df_1b = df_1[df_1['Site'] == site_i]
    labels = df_1b['group']
    wedges, texts, autotexts = ax.pie(df_1b['site_count'],
                                      labels=None,
                                      shadow=False,
                                      colors=[colors[key] for key in labels],
                                      startangle=90,
                                      wedgeprops=wedge_properties,
                                      textprops=dict(color="black"),
                                      autopct='%.1f%%')
    plt.setp(autotexts, size=10)
    ax.set_title(site_i, size=16, weight="bold", y=-0.05)
plt.tight_layout()
plt.savefig('group_diversity.png', bbox_inches='tight', pad_inches=0.05, dpi=600)
plt.show()
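If you want the group labels in a common legend (as mentioned above), one possible sketch, assuming the colors dict and fig from the code above, is to build the legend handles yourself and add them just before plt.savefig(...):
from matplotlib.patches import Patch

# One legend handle per group, colored from the shared colors dict,
# placed once for the whole figure below the grid of pies.
handles = [Patch(facecolor=c, edgecolor='black', label=g) for g, c in colors.items()]
fig.legend(handles=handles, loc='lower center', ncol=5, frameon=False)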

How to forecast unknown future target values with gluonts DeepAR?

I have a time series from 1995-01-01 to 2021-10-01 with monthly frequency. How do I forecast values for the future (the next 3 months), 2021-11-01 to 2022-01-01? Note that I don't have the target values for 2021-11-01, 2021-12-01 and 2022-01-01.
Many thanks!
from gluonts.model.deepar import DeepAREstimator
from gluonts.mx import Trainer
import numpy as np
import mxnet as mx
np.random.seed(7)
mx.random.seed(7)
estimator = DeepAREstimator(
    prediction_length=12,
    context_length=120,
    freq='M',
    trainer=Trainer(
        epochs=5,
        learning_rate=1e-03,
        num_batches_per_epoch=50))
predictor = estimator.train(training_data=df_train)
# Forecasting
predictions = predictor.predict(df_test)
predictions = list(predictions)[0]
predictions = predictions.quantile(0.5)
print(predictions)
[163842.34 152805.08 161326.3 176823.97 127003.79 126937.78
139575.2 117121.67 115754.67 139211.28 122623.586 120102.65 ]
As I understand it, the predicted values are not for "2021-11-01", "2021-12-01" and "2022-01-01". How do I know which months these values refer to? How do I forecast values for the next 3 months: "2021-11-01", "2021-12-01" and "2022-01-01"?
Take a look at this code. It comes from "Advanced Forecasting with Python".
https://github.com/Apress/advanced-forecasting-python/blob/main/Chapter%2020%20-%20Amazon's%20DeepAR.ipynb
It does not seem to forecast unknown future values, since it compares the last 28 values of test_ds (Listing 20-5, R2 score and prediction graph) with the predictions made over that same dataset test_ds (Listing 20-4, Prediction).
How do I forecast unknown future values?
Many thanks!
Data source
https://www.kaggle.com/c/recruit-restaurant-visitor-forecasting
# Listing 20-1. Importing the data
import pandas as pd
y = pd.read_csv('air_visit_data.csv.zip')
y = y.pivot(index='visit_date', columns='air_store_id')['visitors']
y = y.fillna(0)
y = pd.DataFrame(y.sum(axis=1))
y = y.reset_index(drop=False)
y.columns = ['date', 'y']
# Listing 20-2. Preparing the data format required by the gluonts library
from gluonts.dataset.common import ListDataset
start = pd.Timestamp("01-01-2016", freq="H")
# train dataset: cut the last window of length "prediction_length", add "target" and "start" fields
train_ds = ListDataset([{'target': y.loc[:450,'y'], 'start': start}], freq='H')
# test dataset: use the whole dataset, add "target" and "start" fields
test_ds = ListDataset([{'target': y['y'], 'start': start}],freq='H')
# Listing 20-3. Fitting the default DeepAR model
from gluonts.model.deepar import DeepAREstimator
from gluonts.trainer import Trainer
import mxnet as mx
import numpy as np
np.random.seed(7)
mx.random.seed(7)
estimator = DeepAREstimator(
    prediction_length=28,
    context_length=100,
    freq='H',
    trainer=Trainer(ctx="gpu",  # remove if running on windows
                    epochs=5,
                    learning_rate=1e-3,
                    num_batches_per_epoch=100
                    )
)
predictor = estimator.train(train_ds)
# Listing 20-4. Prediction
predictions = predictor.predict(test_ds)
predictions = list(predictions)[0]
predictions = predictions.quantile(0.5)
# Listing 20-5. R2 score and prediction graph
from sklearn.metrics import r2_score
print(r2_score( list(test_ds)[0]['target'][-28:], predictions))
import matplotlib.pyplot as plt
plt.plot(predictions)
plt.plot(list(test_ds)[0]['target'][-28:])
plt.legend(['predictions', 'actuals'])
plt.show()
In your case the context length is 120 and the prediction length is 12, so the model will look back 120 data points to predict 12 future data points.
The recommendation is to reduce the context length to maybe 10 and include the data from the past 10 months in the df_test table.
You can get the start of the forecast using
list(predictor.predict(df_test))[0].start_date
Based on this, create a future table of 12 dates (as 12 is the prediction length).
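For example, a sketch of that last step (assuming predictor and df_test as above; start_date is a monthly pandas Period or Timestamp depending on the gluonts version):
import pandas as pd

forecast = list(predictor.predict(df_test))[0]
# Build the 12 monthly periods the forecast covers, starting at the
# forecast's start date, and pair them with the median predictions.
future_index = pd.period_range(start=forecast.start_date, periods=12, freq='M')
future_table = pd.DataFrame({'date': future_index.to_timestamp(),
                             'prediction': forecast.quantile(0.5)})
print(future_table)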

Increase speed creation for masked xarray file

I am currently trying to crop a rectangular xarray file to the shape of a country using a mask grid. Below you can find my current solution (with simpler and smaller arrays). The code works and I get the desired mask based on 1s and 0s. The problem is that, when run on a real country shape (larger and more complex), the code takes over 30 minutes to run. Since I am using very basic operations here, like nested for loops, I also tried different alternatives, such as a list approach. However, when timing the process, it did not improve on the code below. I wonder if there is a faster way to obtain this mask (vectorization?) or if I should approach the problem in a different way (I tried exploring xarray's properties, but have not found anything that tackles this issue yet).
Code below:
import geopandas as gpd
from shapely.geometry import Polygon, Point
import pandas as pd
import numpy as np
import xarray as xr
df = pd.read_csv('Brazil_borders.csv',index_col=0)
lats = np.array([-20, -5, -5, -20,])
lons = np.array([-60, -60, -30, -30])
lats2 = np.array([-10.25, -10.75, -11.25, -11.75, -12.25, -12.75, -13.25, -13.75,
-14.25, -14.75, -15.25, -15.75, -16.25, -16.75, -17.25, -17.75,
-18.25, -18.75, -19.25, -19.75, -20.25, -20.75, -21.25, -21.75,
-22.25, -22.75, -23.25, -23.75, -24.25, -24.75, -25.25, -25.75,
-26.25, -26.75, -27.25, -27.75, -28.25, -28.75, -29.25, -29.75,
-30.25, -30.75, -31.25, -31.75, -32.25, -32.75])
lons2 = np.array([-61.75, -61.25, -60.75, -60.25, -59.75, -59.25, -58.75, -58.25,
-57.75, -57.25, -56.75, -56.25, -55.75, -55.25, -54.75, -54.25,
-53.75, -53.25, -52.75, -52.25, -51.75, -51.25, -50.75, -50.25,
-49.75, -49.25, -48.75, -48.25, -47.75, -47.25, -46.75, -46.25,
-45.75, -45.25, -44.75, -44.25])
points = []
for i in range(len(lats)):
    _ = [lats[i], lons[i]]
    points.append(_)
poly_proj = Polygon(points)
mask = np.zeros((len(lats2), len(lons2)))  # Mask with the dataset's shape and size.
for i in range(len(lats2)):  # Iteration to verify if a given coordinate is within the polygon's area
    for j in range(len(lons2)):
        grid_point = Point(lats2[i], lons2[j])
        if grid_point.within(poly_proj):
            mask[i][j] = 1
bool_final = mask
bool_final
The alternative based on list approach, but with even worse processing time (according to timeit):
lats = np.array([-20, -5, -5, -20,])
lons = np.array([-60, -60, -30, -30])
lats2 = np.array([-10.25, -10.75, -11.25, -11.75, -12.25, -12.75, -13.25, -13.75,
-14.25, -14.75, -15.25, -15.75, -16.25, -16.75, -17.25, -17.75,
-18.25, -18.75, -19.25, -19.75, -20.25, -20.75, -21.25, -21.75,
-22.25, -22.75, -23.25, -23.75, -24.25, -24.75, -25.25, -25.75,
-26.25, -26.75, -27.25, -27.75, -28.25, -28.75, -29.25, -29.75,
-30.25, -30.75, -31.25, -31.75, -32.25, -32.75])
lons2 = np.array([-61.75, -61.25, -60.75, -60.25, -59.75, -59.25, -58.75, -58.25,
-57.75, -57.25, -56.75, -56.25, -55.75, -55.25, -54.75, -54.25,
-53.75, -53.25, -52.75, -52.25, -51.75, -51.25, -50.75, -50.25,
-49.75, -49.25, -48.75, -48.25, -47.75, -47.25, -46.75, -46.25,
-45.75, -45.25, -44.75, -44.25])
points = []
for i in range(len(lats)):
    _ = [lats[i], lons[i]]
    points.append(_)
poly_proj = Polygon(points)
grid_point = [Point(lats2[i], lons2[j]) for i in range(len(lats2)) for j in range(len(lons2))]
mask = [1 if grid_point[i].within(poly_proj) else 0 for i in range(len(grid_point))]
bool_final2 = np.reshape(mask, (len(lats2), len(lons2)))
Thank you in advance!
Based on this answer from snowman2, I created this simple function that provides a much faster solution by using geopandas and rioxarray. Instead of using a list of latitudes and longitudes, one has to use a shapefile with the desired shape to be masked (Instructions for GeoDataFrame creation from list of coordinates).
import xarray as xr
import geopandas as gpd
import rioxarray
from shapely.geometry import mapping
def mask_shape_border(DS, shape_shp):  # Inputs are the dataset to be cropped and the address of the mask file (.shp)
    if 'lat' in DS:  # Some datasets use lat/lon, others latitude/longitude
        DS.rio.set_spatial_dims(x_dim="lon", y_dim="lat", inplace=True)
    elif 'latitude' in DS:
        DS.rio.set_spatial_dims(x_dim="longitude", y_dim="latitude", inplace=True)
    else:
        print("Error: check latitude and longitude variable names.")
    DS.rio.write_crs("epsg:4326", inplace=True)
    mask = gpd.read_file(shape_shp, crs="epsg:4326")
    DS_clipped = DS.rio.clip(mask.geometry.apply(mapping), mask.crs, drop=False)
    return DS_clipped
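For example, a minimal usage sketch (the file names here are placeholders for illustration only):
import xarray as xr

# Hypothetical dataset and shapefile paths.
ds = xr.open_dataset("precipitation_south_america.nc")
ds_brazil = mask_shape_border(ds, "brazil_border.shp")
# Grid cells outside the border are set to NaN (drop=False keeps the full grid).
print(ds_brazil)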

Interpolating GFS winds from isobaric to height coordinates using Metpy

I have been tasked with making plots of winds at various levels of the atmosphere to support aviation. While I have been able to make some nice plots using GFS model data (see code below), I'm really having to make a rough approximation of height using the pressure coordinates available from the GFS. I'm using winds at 300 hPa, 700 hPa, and 925 hPa to approximate the winds at 30,000 ft, 9,000 ft, and 3,000 ft. My question is really for the MetPy gurus out there... is there a way to interpolate these winds to a height surface? It sure would be nice to get the actual winds at these height levels! Thanks for any light anyone can share on this subject!
import cartopy.crs as ccrs
import cartopy.feature as cfeature
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap
import numpy as np
from netCDF4 import num2date
from datetime import datetime, timedelta
from siphon.catalog import TDSCatalog
from siphon.ncss import NCSS
from PIL import Image
from matplotlib import cm
# For the vertical levels we want to grab with our queries
# Levels need to be in Pa not hPa
Levels = [30000,70000,92500]
# Time deltas for days
Deltas = [1,2,3]
#Deltas = [1]
# Levels in hPa for the file names
LevelDict = {30000:'300', 70000:'700', 92500:'925'}
# The path to where our banners are stored
impath = 'C:\\Users\\shell\\Documents\\Python Scripts\\Banners\\'
# Final images saved here
imoutpath = 'C:\\Users\\shell\\Documents\\Python Scripts\\TVImages\\'
# Quick function for finding out which variable is the time variable in the
# netCDF files
def find_time_var(var, time_basename='time'):
    for coord_name in var.coordinates.split():
        if coord_name.startswith(time_basename):
            return coord_name
    raise ValueError('No time variable found for ' + var.name)
# Function to grab data at different levels from Siphon
def grabData(level):
    query.var = set()
    query.variables('u-component_of_wind_isobaric', 'v-component_of_wind_isobaric')
    query.vertical_level(level)
    data = ncss.get_data(query)
    u_wind_var = data.variables['u-component_of_wind_isobaric']
    v_wind_var = data.variables['v-component_of_wind_isobaric']
    time_var = data.variables[find_time_var(u_wind_var)]
    lat_var = data.variables['lat']
    lon_var = data.variables['lon']
    return u_wind_var, v_wind_var, time_var, lat_var, lon_var
# Construct a TDSCatalog instance pointing to the gfs dataset
best_gfs = TDSCatalog('http://thredds-jetstream.unidata.ucar.edu/thredds/catalog/grib/'
'NCEP/GFS/Global_0p5deg/catalog.xml')
# Pull out the dataset you want to use and look at the access URLs
best_ds = list(best_gfs.datasets.values())[1]
#print(best_ds.access_urls)
# Create NCSS object to access the NetcdfSubset
ncss = NCSS(best_ds.access_urls['NetcdfSubset'])
print(best_ds.access_urls['NetcdfSubset'])
# Looping through the forecast times
for delta in Deltas:
    # Create lat/lon box and the time(s) for location you want to get data for
    now = datetime.utcnow()
    fcst = now + timedelta(days=delta)
    timestamp = datetime.strftime(fcst, '%A')
    query = ncss.query()
    query.lonlat_box(north=78, south=45, east=-90, west=-220).time(fcst)
    query.accept('netcdf4')
    # Now looping through the levels to create our plots
    for level in Levels:
        u_wind_var, v_wind_var, time_var, lat_var, lon_var = grabData(level)
        # Get actual data values and remove any size 1 dimensions
        lat = lat_var[:].squeeze()
        lon = lon_var[:].squeeze()
        u_wind = u_wind_var[:].squeeze()
        v_wind = v_wind_var[:].squeeze()
        # converting to knots
        u_windkt = u_wind * 1.94384
        v_windkt = v_wind * 1.94384
        wspd = np.sqrt(np.power(u_windkt, 2) + np.power(v_windkt, 2))
        # Convert number of hours since the reference time into an actual date
        time = num2date(time_var[:].squeeze(), time_var.units)
        print(time)
        # Combine 1D latitude and longitudes into a 2D grid of locations
        lon_2d, lat_2d = np.meshgrid(lon, lat)
        # Create new figure
        #fig = plt.figure(figsize = (18,12))
        fig = plt.figure()
        fig.set_size_inches(26.67, 15)
        datacrs = ccrs.PlateCarree()
        plotcrs = ccrs.LambertConformal(central_longitude=-150,
                                        central_latitude=55,
                                        standard_parallels=(30, 60))
        # Add the map and set the extent
        ax = plt.axes(projection=plotcrs)
        ext = ax.set_extent([-195., -115., 50., 72.], datacrs)
        ext2 = ax.set_aspect('auto')
        ax.background_patch.set_fill(False)
        # Add state boundaries to plot
        ax.add_feature(cfeature.STATES, edgecolor='black', linewidth=2)
        # Add geopolitical boundaries for map reference
        ax.add_feature(cfeature.COASTLINE.with_scale('50m'))
        ax.add_feature(cfeature.OCEAN.with_scale('50m'))
        ax.add_feature(cfeature.LAND.with_scale('50m'), facecolor='#cc9666', linewidth=4)
        if level == 30000:
            spdrng_sped = np.arange(30, 190, 2)
            windlvl = 'Jet_Stream'
        elif level == 70000:
            spdrng_sped = np.arange(20, 100, 1)
            windlvl = '9000_Winds_Aloft'
        elif level == 92500:
            spdrng_sped = np.arange(20, 80, 1)
            windlvl = '3000_Winds_Aloft'
        else:
            pass
        top = cm.get_cmap('Greens')
        middle = cm.get_cmap('YlOrRd')
        bottom = cm.get_cmap('BuPu_r')
        newcolors = np.vstack((top(np.linspace(0, 1, 128)),
                               middle(np.linspace(0, 1, 128))))
        newcolors2 = np.vstack((newcolors, bottom(np.linspace(0, 1, 128))))
        cmap = ListedColormap(newcolors2)
        cf = ax.contourf(lon_2d, lat_2d, wspd, spdrng_sped, cmap=cmap,
                         transform=datacrs, extend='max', alpha=0.75)
        cbar = plt.colorbar(cf, orientation='horizontal', pad=0, aspect=50,
                            drawedges='true')
        cbar.ax.tick_params(labelsize=16)
        wslice = slice(1, None, 4)
        ax.quiver(lon_2d[wslice, wslice], lat_2d[wslice, wslice],
                  u_windkt[wslice, wslice], v_windkt[wslice, wslice], width=0.0015,
                  headlength=4, headwidth=3, angles='xy', color='black', transform=datacrs)
        plt.savefig(imoutpath + 'TV_UpperAir' + LevelDict[level] + '_' + timestamp + '.png', bbox_inches='tight')
        # Now we use Pillow to overlay the banner with the appropriate day
        background = Image.open(imoutpath + 'TV_UpperAir' + LevelDict[level] + '_' + timestamp + '.png')
        im = Image.open(impath + 'Banner_' + windlvl + '_' + timestamp + '.png')
        # resize the image
        size = background.size
        im = im.resize(size, Image.ANTIALIAS)
        background.paste(im, (17, 8), im)
        background.save(imoutpath + 'TV_UpperAir' + LevelDict[level] + '_' + timestamp + '.png', 'PNG')
Thanks for the question! My approach here is, for each grid column, to interpolate the pressure coordinate of the GFS-output geopotential height onto your provided altitudes, estimating the pressure of each height level in that column. Then I can use that pressure to interpolate the GFS-output u and v winds onto. The GFS-output geopotential height and winds have very slightly different pressure coordinates, which is why I interpolated twice. I performed the interpolation using MetPy's interpolate.log_interpolate_1d, which performs a linear interpolation on the log of the inputs. Here is the code I used!
from datetime import datetime
import numpy as np
import metpy.calc as mpcalc
from metpy.units import units
from metpy.interpolate import log_interpolate_1d
from siphon.catalog import TDSCatalog
gfs_url = 'https://tds.scigw.unidata.ucar.edu/thredds/catalog/grib/NCEP/GFS/Global_0p5deg/catalog.xml'
cat = TDSCatalog(gfs_url)
now = datetime.utcnow()
# A shortcut to NCSS
ncss = cat.datasets['Best GFS Half Degree Forecast Time Series'].subset()
query = ncss.query()
query.var = set()
query.variables('u-component_of_wind_isobaric', 'v-component_of_wind_isobaric', 'Geopotential_height_isobaric')
query.lonlat_box(north=78, south=45, east=-90, west=-220)
query.time(now)
query.accept('netcdf4')
data = ncss.get_data(query)
# Reading in the u(isobaric), v(isobaric), isobaric vars and the GPH(isobaric6) and isobaric6 vars
# These are two slightly different vertical pressure coordinates.
# We will also assign units here, and this can allow us to go ahead and convert to knots
lat = units.Quantity(data.variables['lat'][:].squeeze(), units('degrees'))
lon = units.Quantity(data.variables['lon'][:].squeeze(), units('degrees'))
iso_wind = units.Quantity(data.variables['isobaric'][:].squeeze(), units('Pa'))
iso_gph = units.Quantity(data.variables['isobaric6'][:].squeeze(), units('Pa'))
u = units.Quantity(data.variables['u-component_of_wind_isobaric'][:].squeeze(), units('m/s')).to(units('knots'))
v = units.Quantity(data.variables['v-component_of_wind_isobaric'][:].squeeze(), units('m/s')).to(units('knots'))
gph = units.Quantity(data.variables['Geopotential_height_isobaric'][:].squeeze(), units('gpm'))
# Here we will select our altitudes to interpolate onto and convert them to geopotential meters
altitudes = ([30000., 9000., 3000.] * units('ft')).to(units('gpm'))
# Now we will interpolate the pressure coordinate for model output geopotential height to
# estimate the pressure level for our given altitudes at each grid point
pressures_of_alts = np.zeros((len(altitudes), len(lat), len(lon)))
for ilat in range(len(lat)):
    for ilon in range(len(lon)):
        pressures_of_alts[:, ilat, ilon] = log_interpolate_1d(altitudes,
                                                              gph[:, ilat, ilon],
                                                              iso_gph)
pressures_of_alts = pressures_of_alts * units('Pa')
# Similarly, we will use our interpolated pressures to interpolate
# our u and v winds across their given pressure coordinates.
# This will provide u, v at each of our interpolated pressure
# levels corresponding to our provided initial altitudes
u_at_levs = np.zeros((len(altitudes), len(lat), len(lon)))
v_at_levs = np.zeros((len(altitudes), len(lat), len(lon)))
for ilat in range(len(lat)):
    for ilon in range(len(lon)):
        u_at_levs[:, ilat, ilon], v_at_levs[:, ilat, ilon] = log_interpolate_1d(pressures_of_alts[:, ilat, ilon],
                                                                                iso_wind,
                                                                                u[:, ilat, ilon],
                                                                                v[:, ilat, ilon])
u_at_levs = u_at_levs * units('knots')
v_at_levs = v_at_levs * units('knots')
# We can use mpcalc to calculate a wind speed array from these
wspd = mpcalc.wind_speed(u_at_levs, v_at_levs)
I was able to take my output from this and coerce it into your plotting code (with some unit stripping).
(Images attached: your 300-hPa GFS winds, my "30,000-ft" GFS winds, and my interpolated pressure fields at each estimated height level.)
Hope this helps!
I am not sure if this is what you are looking for (I am very new to MetPy), but I have been using MetPy's height_to_pressure_std(altitude) function. It returns the pressure in hPa, which I then convert to pascals and then to a unitless value to use in Siphon's vertical_level(float) function.
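For example, something like this sketch (assuming the query object from your code):
from metpy.calc import height_to_pressure_std
from metpy.units import units

# Standard-atmosphere pressure for a 9,000 ft level (a pint Quantity, in hPa).
p_std = height_to_pressure_std(9000 * units('ft'))
# Convert to Pa and strip the units for Siphon's vertical_level() call.
query.vertical_level(p_std.to(units('Pa')).magnitude)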
I don't think you can use MetPy functions to convert height to pressure or vice versa in the upper atmosphere. The errors are too large when using the Standard Atmosphere to convert, say, pressure to feet.

Error adding scale bar with map.scale

I managed to create a map and even include a north arrow, but I can't get map.scale to work and I am getting this kind of error:
Error in map.scale(x = -83, y = 12, ratio = FALSE, relwidth = 0.2, cex = 0.6) : unused arguments (ratio = FALSE, relwidth = 0.2, cex = 0.6)
Here is the code:
library(maps)
library(mapdata)
library(ggmap)
library(mapproj)
library(maptools) #for shapefiles
library(scales) #for transparency
library(GISTools)
range <- readShapePoly("isthmanianpacificmoistforestecoregion") #layer of data for species range
map("worldHires", c('Cost', 'pan', 'Nic', 'Colombia'), xlim=c(-89,-75),ylim=c(5,13), col="lightgray", fill=TRUE) #plot the region I want
map.scale(-81,8,relwidth = 0.15, metric = TRUE, ratio = TRUE)
plot(range, add=TRUE, xlim=c(-89,-75),ylim=c(5,13), col=alpha("green", 0.6), border=TRUE)
map.scale(x=-80, y=10) #, relwidth=0.3, cex=0.5, ratio=FALSE)
north.arrow(xb=-77, yb=12, len=0.2, lab="N", col="black", fill=TRUE) #
The problem is that map.scale() exists in both the maps and GISTools packages. You are trying to use the function from the maps package, but since you load maps first and then GISTools, the map.scale() from maps is being masked (R probably throws a warning when loading the second package).
The solution is to specify the package in the function call:
maps::map.scale(-81,8,relwidth = 0.15, metric = TRUE, ratio = TRUE)
Also, why two calls to map.scale()? You should probably remove one of them.
