PROJ string not recognised - gdal

I've used the GDAL command window to run projinfo -s epsg:27700 -t epsg:3857 --spatial-test intersects
From the list of operations I have found the one I want to use in my gdalwarp command. It is as follows:
Operation No. 2:
unknown id, Inverse of British National Grid + OSGB36 to WGS 84 (6) + Popular Visualisation Pseudo-Mercator, 2 m, United Kingdom (UK) - Great Britain - England and Wales onshore, Scotland onshore and Western Isles nearshore including Sea of the Hebrides and The Minch; Isle of Man onshore.
PROJ string:
+proj=pipeline
+step +inv +proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 +x_0=400000
+y_0=-100000 +ellps=airy
+step +proj=push +v_3
+step +proj=cart +ellps=airy
+step +proj=helmert +x=446.448 +y=-125.157 +z=542.06 +rx=0.15 +ry=0.247
+rz=0.842 +s=-20.489 +convention=position_vector
+step +inv +proj=cart +ellps=WGS84
+step +proj=pop +v_3
+step +proj=webmerc +lat_0=0 +lon_0=0 +x_0=0 +y_0=0 +ellps=WGS84
WKT2:2019 string:
CONCATENATEDOPERATION["Inverse of British National Grid + OSGB36 to WGS 84 (6) + Popular Visualisation Pseudo-Mercator",
SOURCECRS[
PROJCRS["OSGB36 / British National Grid",
BASEGEOGCRS["OSGB36",
DATUM["Ordnance Survey of Great Britain 1936",
ELLIPSOID["Airy 1830",6377563.396,299.3249646,
LENGTHUNIT["metre",1]]],
PRIMEM["Greenwich",0,
ANGLEUNIT["degree",0.0174532925199433]],
ID["EPSG",4277]],
CONVERSION["British National Grid",
METHOD["Transverse Mercator",
ID["EPSG",9807]],
PARAMETER["Latitude of natural origin",49,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8801]],
PARAMETER["Longitude of natural origin",-2,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8802]],
PARAMETER["Scale factor at natural origin",0.9996012717,
SCALEUNIT["unity",1],
ID["EPSG",8805]],
PARAMETER["False easting",400000,
LENGTHUNIT["metre",1],
ID["EPSG",8806]],
PARAMETER["False northing",-100000,
LENGTHUNIT["metre",1],
ID["EPSG",8807]]],
CS[Cartesian,2],
AXIS["(E)",east,
ORDER[1],
LENGTHUNIT["metre",1]],
AXIS["(N)",north,
ORDER[2],
LENGTHUNIT["metre",1]],
ID["EPSG",27700]]],
TARGETCRS[
PROJCRS["WGS 84 / Pseudo-Mercator",
BASEGEOGCRS["WGS 84",
ENSEMBLE["World Geodetic System 1984 ensemble",
MEMBER["World Geodetic System 1984 (Transit)"],
MEMBER["World Geodetic System 1984 (G730)"],
MEMBER["World Geodetic System 1984 (G873)"],
MEMBER["World Geodetic System 1984 (G1150)"],
MEMBER["World Geodetic System 1984 (G1674)"],
MEMBER["World Geodetic System 1984 (G1762)"],
MEMBER["World Geodetic System 1984 (G2139)"],
ELLIPSOID["WGS 84",6378137,298.257223563,
LENGTHUNIT["metre",1]],
ENSEMBLEACCURACY[2.0]],
PRIMEM["Greenwich",0,
ANGLEUNIT["degree",0.0174532925199433]],
ID["EPSG",4326]],
CONVERSION["Popular Visualisation Pseudo-Mercator",
METHOD["Popular Visualisation Pseudo Mercator",
ID["EPSG",1024]],
PARAMETER["Latitude of natural origin",0,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8801]],
PARAMETER["Longitude of natural origin",0,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8802]],
PARAMETER["False easting",0,
LENGTHUNIT["metre",1],
ID["EPSG",8806]],
PARAMETER["False northing",0,
LENGTHUNIT["metre",1],
ID["EPSG",8807]]],
CS[Cartesian,2],
AXIS["easting (X)",east,
ORDER[1],
LENGTHUNIT["metre",1]],
AXIS["northing (Y)",north,
ORDER[2],
LENGTHUNIT["metre",1]],
ID["EPSG",3857]]],
STEP[
CONVERSION["Inverse of British National Grid",
METHOD["Inverse of Transverse Mercator",
ID["INVERSE(EPSG)",9807]],
PARAMETER["Latitude of natural origin",49,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8801]],
PARAMETER["Longitude of natural origin",-2,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8802]],
PARAMETER["Scale factor at natural origin",0.9996012717,
SCALEUNIT["unity",1],
ID["EPSG",8805]],
PARAMETER["False easting",400000,
LENGTHUNIT["metre",1],
ID["EPSG",8806]],
PARAMETER["False northing",-100000,
LENGTHUNIT["metre",1],
ID["EPSG",8807]],
ID["INVERSE(EPSG)",19916]]],
STEP[
COORDINATEOPERATION["OSGB36 to WGS 84 (6)",
VERSION["UKOOA-Pet"],
SOURCECRS[
GEOGCRS["OSGB36",
DATUM["Ordnance Survey of Great Britain 1936",
ELLIPSOID["Airy 1830",6377563.396,299.3249646,
LENGTHUNIT["metre",1]]],
PRIMEM["Greenwich",0,
ANGLEUNIT["degree",0.0174532925199433]],
CS[ellipsoidal,2],
AXIS["geodetic latitude (Lat)",north,
ORDER[1],
ANGLEUNIT["degree",0.0174532925199433]],
AXIS["geodetic longitude (Lon)",east,
ORDER[2],
ANGLEUNIT["degree",0.0174532925199433]],
ID["EPSG",4277]]],
TARGETCRS[
GEOGCRS["WGS 84",
ENSEMBLE["World Geodetic System 1984 ensemble",
MEMBER["World Geodetic System 1984 (Transit)"],
MEMBER["World Geodetic System 1984 (G730)"],
MEMBER["World Geodetic System 1984 (G873)"],
MEMBER["World Geodetic System 1984 (G1150)"],
MEMBER["World Geodetic System 1984 (G1674)"],
MEMBER["World Geodetic System 1984 (G1762)"],
MEMBER["World Geodetic System 1984 (G2139)"],
ELLIPSOID["WGS 84",6378137,298.257223563,
LENGTHUNIT["metre",1]],
ENSEMBLEACCURACY[2.0]],
PRIMEM["Greenwich",0,
ANGLEUNIT["degree",0.0174532925199433]],
CS[ellipsoidal,2],
AXIS["geodetic latitude (Lat)",north,
ORDER[1],
ANGLEUNIT["degree",0.0174532925199433]],
AXIS["geodetic longitude (Lon)",east,
ORDER[2],
ANGLEUNIT["degree",0.0174532925199433]],
ID["EPSG",4326]]],
METHOD["Position Vector transformation (geog2D domain)",
ID["EPSG",9606]],
PARAMETER["X-axis translation",446.448,
LENGTHUNIT["metre",1],
ID["EPSG",8605]],
PARAMETER["Y-axis translation",-125.157,
LENGTHUNIT["metre",1],
ID["EPSG",8606]],
PARAMETER["Z-axis translation",542.06,
LENGTHUNIT["metre",1],
ID["EPSG",8607]],
PARAMETER["X-axis rotation",0.15,
ANGLEUNIT["arc-second",4.84813681109536E-06],
ID["EPSG",8608]],
PARAMETER["Y-axis rotation",0.247,
ANGLEUNIT["arc-second",4.84813681109536E-06],
ID["EPSG",8609]],
PARAMETER["Z-axis rotation",0.842,
ANGLEUNIT["arc-second",4.84813681109536E-06],
ID["EPSG",8610]],
PARAMETER["Scale difference",-20.489,
SCALEUNIT["parts per million",1E-06],
ID["EPSG",8611]],
OPERATIONACCURACY[2.0],
ID["EPSG",1314],
REMARK["Commonly referred to as the ""OSGB Petroleum transformation"". For a more accurate transformation see ETRS89 to OSGB36 / British National Grid (3) (code 7953)."]]],
STEP[
CONVERSION["Popular Visualisation Pseudo-Mercator",
METHOD["Popular Visualisation Pseudo Mercator",
ID["EPSG",1024]],
PARAMETER["Latitude of natural origin",0,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8801]],
PARAMETER["Longitude of natural origin",0,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8802]],
PARAMETER["False easting",0,
LENGTHUNIT["metre",1],
ID["EPSG",8806]],
PARAMETER["False northing",0,
LENGTHUNIT["metre",1],
ID["EPSG",8807]],
ID["EPSG",3856]]],
USAGE[
SCOPE["unknown"],
AREA["United Kingdom (UK) - Great Britain - England and Wales onshore, Scotland onshore and Western Isles nearshore including Sea of the Hebrides and The Minch; Isle of Man onshore."],
BBOX[49.797,-8.818,60.935,1.92]]]
However, when I copy the PROJ string in as the -t_srs value I get the error PROJ: proj_create: unrecognized format / unknown name.
I'm not quite sure how to pass the PROJ string in correctly, as I am new to this. I know I can also pass the operation in as WKT, but that seems like an awful lot of text to put on the command line.
Is there a way of refining this information and passing it in correctly?
It is specifically this transformation I want to use, so I need to make sure I pass it through correctly.
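For what it's worth, a +proj=pipeline string describes a coordinate operation rather than a CRS, which is presumably why -t_srs (which expects a CRS) rejects it. Below is a minimal sketch of how the pipeline could be passed instead, assuming GDAL 3.x where gdalwarp accepts an explicit coordinate operation via the -ct switch; the file names are placeholders and the pipeline is the one from the projinfo output above:
import subprocess

# The pipeline from "Operation No. 2" above, collapsed onto one line.
pipeline = (
    "+proj=pipeline "
    "+step +inv +proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 "
    "+x_0=400000 +y_0=-100000 +ellps=airy "
    "+step +proj=push +v_3 "
    "+step +proj=cart +ellps=airy "
    "+step +proj=helmert +x=446.448 +y=-125.157 +z=542.06 "
    "+rx=0.15 +ry=0.247 +rz=0.842 +s=-20.489 +convention=position_vector "
    "+step +inv +proj=cart +ellps=WGS84 "
    "+step +proj=pop +v_3 "
    "+step +proj=webmerc +lat_0=0 +lon_0=0 +x_0=0 +y_0=0 +ellps=WGS84"
)

# -t_srs stays a plain CRS; the specific transformation goes in via -ct.
# "input.tif" / "output.tif" are placeholder file names.
subprocess.run(
    ["gdalwarp",
     "-s_srs", "EPSG:27700",
     "-t_srs", "EPSG:3857",
     "-ct", pipeline,
     "input.tif", "output.tif"],
    check=True,
)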

Related

Pop the first element in a pandas column

I have a pandas column like below:
import pandas as pd
data = {'id': ['001', '002', '003'],
        'address': [['William J. Clare', '290 Valley Dr.', 'Casper, WY 82604', 'USA, United States'],
                    ['1180 Shelard Tower', 'Minneapolis, MN 55426', 'USA, United States'],
                    ['William N. Barnard', '145 S. Durbin', 'Casper, WY 82601', 'USA, United States']]
        }
df = pd.DataFrame(data)
I want to pop the first element of each address list if it is a name, i.e. if it doesn't contain any number.
output:
[['290 Valley Dr.', 'Casper, WY 82604','USA, United States'], ['1180 Shelard Tower', 'Minneapolis, MN 55426', 'USA, United States'], ['145 S. Durbin', 'Casper, WY 82601', 'USA, United States']]
This is a continuation of my previous post. I am learning Python and this is my second project; I have been struggling with this since this morning, so please help.
Assuming you define an address as a string starting with a number (you can change the logic):
for l in df['address']:
    if not l[0][0].isdigit():
        l.pop(0)
print(df)
updated df:
id address
0 001 [290 Valley Dr., Casper, WY 82604, USA, United...
1 002 [1180 Shelard Tower, Minneapolis, MN 55426, US...
2 003 [145 S. Durbin, Casper, WY 82601, USA, United ...
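If you would rather not mutate the lists in place, here is a small equivalent sketch using apply, under the same assumption that a real address line starts with a digit (df as defined in the question):
df['address'] = df['address'].apply(
    lambda addr: addr[1:] if not addr[0][0].isdigit() else addr
)
print(df)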

Aggregate data based on values appearing in two columns interchangeably?

home_team_name away_team_name home_ppg_per_odds_pre_game away_ppg_per_odds_pre_game
0 Manchester United Tottenham Hotspur 3.310000 4.840000
1 AFC Bournemouth Aston Villa 0.666667 3.230000
2 Norwich City Crystal Palace 0.666667 13.820000
3 Leicester City Sunderland 4.733333 3.330000
4 Everton Watford 0.583333 2.386667
5 Chelsea Manchester United 1.890000 3.330000
The home_ppg_per_odds_pre_game and away_ppg_per_odds_pre_game columns are basically the same metric. The former represents the value of this metric for the home team, while the latter represents it for the away team. I want the mean of this metric for each team, regardless of whether the team is playing home or away. In the example df, Manchester United appears as home_team_name in row 0 and as away_team_name in row 5; I want the mean for Manchester United to include both of those rows.
df.groupby("home_team_name")["home_ppg_per_odds_pre_game"].mean()
This only gives me the mean for the occasions when the team is playing at home, but I want both home and away.
Since the two metrics are the same, you can append the home and away team metrics, like this:
data_df = pd.concat([
    df.loc[:, ('home_team_name', 'home_ppg_per_odds_pre_game')],
    df.loc[:, ('away_team_name', 'away_ppg_per_odds_pre_game')].rename(
        columns={'away_team_name': 'home_team_name',
                 'away_ppg_per_odds_pre_game': 'home_ppg_per_odds_pre_game'})
])
Then you can use groupby to get the means:
data_df.groupby('home_team_name')['home_ppg_per_odds_pre_game'].mean().reset_index()
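The same idea written out with neutral column names can make the result easier to read; the names team and ppg_per_odds below are illustrative, not from the original data:
import pandas as pd

home = df[['home_team_name', 'home_ppg_per_odds_pre_game']].rename(
    columns={'home_team_name': 'team', 'home_ppg_per_odds_pre_game': 'ppg_per_odds'})
away = df[['away_team_name', 'away_ppg_per_odds_pre_game']].rename(
    columns={'away_team_name': 'team', 'away_ppg_per_odds_pre_game': 'ppg_per_odds'})

long_df = pd.concat([home, away], ignore_index=True)
team_means = long_df.groupby('team')['ppg_per_odds'].mean().reset_index()
print(team_means)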

How to convert image coordinates to UTM coordinates in Python

I labelled part of a GeoTIFF image with a polygon using annotation software whose output is X,Y image coordinates (including sub-pixel ones) for each point of the polygon, in XML format.
The question is how to convert these points to UTM coordinates in Python and write them out in GeoJSON format.
From the GeoTiff image, I can extract the following information:
gdalinfo -mm test-area.tif
Driver: GTiff/GeoTIFF
Files: test-area.tif
Size is 1356, 1351
Coordinate System is:
PROJCRS["WGS 84 / UTM zone 11N",
BASEGEOGCRS["WGS 84",
DATUM["World Geodetic System 1984",
ELLIPSOID["WGS 84",6378137,298.257223563,
LENGTHUNIT["metre",1]]],
PRIMEM["Greenwich",0,
ANGLEUNIT["degree",0.0174532925199433]],
ID["EPSG",4326]],
CONVERSION["UTM zone 11N",
METHOD["Transverse Mercator",
ID["EPSG",9807]],
PARAMETER["Latitude of natural origin",0,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8801]],
PARAMETER["Longitude of natural origin",-117,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8802]],
PARAMETER["Scale factor at natural origin",0.9996,
SCALEUNIT["unity",1],
ID["EPSG",8805]],
PARAMETER["False easting",500000,
LENGTHUNIT["metre",1],
ID["EPSG",8806]],
PARAMETER["False northing",0,
LENGTHUNIT["metre",1],
ID["EPSG",8807]]],
CS[Cartesian,2],
AXIS["(E)",east,
ORDER[1],
LENGTHUNIT["metre",1]],
AXIS["(N)",north,
ORDER[2],
LENGTHUNIT["metre",1]],
USAGE[
SCOPE["unknown"],
AREA["World - N hemisphere - 120°W to 114°W - by country"],
BBOX[0,-120,84,-114]],
ID["EPSG",32611]]
Data axis to CRS axis mapping: 1,2
Origin = (432390.000000000000000,3727776.000000000000000)
Pixel Size = (3.000000000000000,-3.000000000000000)
Metadata:
AREA_OR_POINT=Area
Image Structure Metadata:
INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left ( 432390.000, 3727776.000) (117d43'46.05"W, 33d41'15.97"N)
Lower Left ( 432390.000, 3723723.000) (117d43'44.94"W, 33d39' 4.38"N)
Upper Right ( 436458.000, 3727776.000) (117d41' 8.05"W, 33d41'16.87"N)
Lower Right ( 436458.000, 3723723.000) (117d41' 7.01"W, 33d39' 5.28"N)
Center ( 434424.000, 3725749.500) (117d42'26.51"W, 33d40'10.63"N)
Band 1 Block=1356x1 Type=Int16, ColorInterp=Gray
Computed Min/Max=185.000,4470.000
NoData Value=32767
Band 2 Block=1356x1 Type=Int16, ColorInterp=Undefined
Computed Min/Max=299.000,4895.000
NoData Value=32767
Band 3 Block=1356x1 Type=Int16, ColorInterp=Undefined
Computed Min/Max=276.000,5419.000
NoData Value=32767
Band 4 Block=1356x1 Type=Int16, ColorInterp=Undefined
Computed Min/Max=659.000,5466.000
NoData Value=32767
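For the conversion itself, here is a minimal sketch based on the affine geotransform that gdalinfo reports above (Origin and Pixel Size): each (column, row) pixel position, including sub-pixel values, maps to UTM through that transform. The pixel coordinates of the polygon below are placeholders standing in for the values parsed from the annotation XML:
from osgeo import gdal
import json

ds = gdal.Open('test-area.tif')
gt = ds.GetGeoTransform()  # (origin_x, pixel_w, row_rot, origin_y, col_rot, pixel_h)

def pixel_to_utm(col, row):
    # Standard GDAL affine transform; works for sub-pixel coordinates too.
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# Placeholder pixel coordinates parsed from the annotation XML.
pixel_ring = [(100.5, 200.25), (400.0, 210.0), (380.0, 600.5), (100.5, 200.25)]
utm_ring = [pixel_to_utm(c, r) for c, r in pixel_ring]

# The resulting coordinates are in EPSG:32611 (WGS 84 / UTM zone 11N), per gdalinfo above.
feature = {
    "type": "Feature",
    "properties": {},
    "geometry": {"type": "Polygon", "coordinates": [utm_ring]},
}
print(json.dumps(feature))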

perigee and apogee calculations off by a few minutes

I'm trying to calculate perigee and apogee (or apsides in general, given a second body such as the Sun, a planet, etc.).
from skyfield import api, almanac
from scipy.signal import argrelextrema
import numpy as np
ts = api.load.timescale()
e = api.load('de430t.bsp')

def apsis(year=2019, body='moon'):
    apogees = dict()
    perigees = dict()
    planets = e
    earth, moon = planets['earth'], planets[body]
    t = ts.utc(year, 1, range(1, 367))
    dt = t.utc_datetime()
    astrometric = earth.at(t).observe(moon)
    _, _, distance = astrometric.radec()
    # find perigees (local minima of the distance), at day precision
    localmins = argrelextrema(distance.km, np.less)[0]
    for i in localmins:
        # refine to minute precision over a two-day window
        t2 = ts.utc(dt[i].year, dt[i].month, dt[i].day - 1, 0, range(2881))
        dt2 = t2.utc_datetime()  # _and_leap_second()
        astrometric2 = earth.at(t2).observe(moon)
        _, _, distance2 = astrometric2.radec()
        m = min(distance2.km)
        daindex = list(distance2.km).index(m)
        perigees[dt2[daindex]] = m
    # find apogees (local maxima of the distance), at day precision
    localmaxes = argrelextrema(distance.km, np.greater)[0]
    for i in localmaxes:
        # refine to minute precision over a two-day window
        t2 = ts.utc(dt[i].year, dt[i].month, dt[i].day - 1, 0, range(2881))
        dt2 = t2.utc_datetime()
        astrometric2 = earth.at(t2).observe(moon)
        _, _, distance2 = astrometric2.radec()
        m = max(distance2.km)
        daindex = list(distance2.km).index(m)
        apogees[dt2[daindex]] = m
    return apogees, perigees
When I run this for 2019, the next apogee comes out at 2019-09-13 13:16. This differs by several minutes from tables such as John Walker's (13:33), Fred Espenak's (13:32), and timeanddate.com's (13:32).
I'd expect a difference of a minute or so between sources, for reasons such as rounding versus truncation of seconds, but more than 15 minutes seems unusual. I've tried this with the de431t and de421 ephemerides, with similar results.
What's the difference here? I'm calculating the distance between the centres of each body, right? What am I screwing up?
After a bit more research and comparing Skyfield's output to the output of JPL's HORIZONS, it appears that Skyfield is correct in its calculations, at least against the JPL ephemeris (no surprise there).
I switched the above code snippet to use the same (massive) de432t SPICE kernel used by HORIZONS. This lines up with the HORIZONS output (see below, with the apogee times reported by the various sources marked): at the apogee the Moon stops moving away and starts approaching again, i.e. deldot, the range-rate between the observer (geocentric Earth) and the target body (geocentric Moon), crosses from positive to negative.
Ephemeris / WWW_USER Fri Sep 13 17:05:39 2019 Pasadena, USA / Horizons
*******************************************************************************
Target body name: Moon (301) {source: DE431mx}
Center body name: Earth (399) {source: DE431mx}
Center-site name: GEOCENTRIC
*******************************************************************************
Start time : A.D. 2019-Sep-13 13:10:00.0000 UT
Stop time : A.D. 2019-Sep-13 13:35:00.0000 UT
Step-size : 1 minutes
*******************************************************************************
Target pole/equ : IAU_MOON {East-longitude positive}
Target radii : 1737.4 x 1737.4 x 1737.4 km {Equator, meridian, pole}
Center geodetic : 0.00000000,0.00000000,0.0000000 {E-lon(deg),Lat(deg),Alt(km)}
Center cylindric: 0.00000000,0.00000000,0.0000000 {E-lon(deg),Dxy(km),Dz(km)}
Center pole/equ : High-precision EOP model {East-longitude positive}
Center radii : 6378.1 x 6378.1 x 6356.8 km {Equator, meridian, pole}
Target primary : Earth
Vis. interferer : MOON (R_eq= 1737.400) km {source: DE431mx}
Rel. light bend : Sun, EARTH {source: DE431mx}
Rel. lght bnd GM: 1.3271E+11, 3.9860E+05 km^3/s^2
Atmos refraction: NO (AIRLESS)
RA format : HMS
Time format : CAL
EOP file : eop.190912.p191204
EOP coverage : DATA-BASED 1962-JAN-20 TO 2019-SEP-12. PREDICTS-> 2019-DEC-03
Units conversion: 1 au= 149597870.700 km, c= 299792.458 km/s, 1 day= 86400.0 s
Table cut-offs 1: Elevation (-90.0deg=NO ),Airmass (>38.000=NO), Daylight (NO )
Table cut-offs 2: Solar elongation ( 0.0,180.0=NO ),Local Hour Angle( 0.0=NO )
Table cut-offs 3: RA/DEC angular rate ( 0.0=NO )
*******************************************************************************
Date__(UT)__HR:MN delta deldot
***************************************************
$$SOE
2019-Sep-13 13:10 0.00271650099697 0.0000340
2019-Sep-13 13:11 0.00271650100952 0.0000286
2019-Sep-13 13:12 0.00271650101990 0.0000232
2019-Sep-13 13:13 0.00271650102812 0.0000178
2019-Sep-13 13:14 0.00271650103417 0.0000124
2019-Sep-13 13:15 0.00271650103805 0.0000070
2019-Sep-13 13:16 0.00271650103977 0.0000016 <----- Skyfield, HORIZONS
2019-Sep-13 13:17 0.00271650103932 -0.0000038
2019-Sep-13 13:18 0.00271650103670 -0.0000092
2019-Sep-13 13:19 0.00271650103191 -0.0000146
2019-Sep-13 13:20 0.00271650102496 -0.0000200
2019-Sep-13 13:21 0.00271650101585 -0.0000254
2019-Sep-13 13:22 0.00271650100456 -0.0000308
2019-Sep-13 13:23 0.00271650099112 -0.0000362
2019-Sep-13 13:24 0.00271650097550 -0.0000416
2019-Sep-13 13:25 0.00271650095772 -0.0000470
2019-Sep-13 13:26 0.00271650093778 -0.0000524
2019-Sep-13 13:27 0.00271650091566 -0.0000578
2019-Sep-13 13:28 0.00271650089139 -0.0000632
2019-Sep-13 13:29 0.00271650086494 -0.0000686
2019-Sep-13 13:30 0.00271650083633 -0.0000740
2019-Sep-13 13:31 0.00271650080556 -0.0000794
2019-Sep-13 13:32 0.00271650077262 -0.0000848 <------ Espenak, T&D.com
2019-Sep-13 13:33 0.00271650073751 -0.0000902
2019-Sep-13 13:34 0.00271650070024 -0.0000956
2019-Sep-13 13:35 0.00271650066081 -0.0001010
$$EOE
Looking at Espenak's page a bit more, his calculations are based on Jean Meeus' Astronomical Algorithms book (a must-have for anyone who plays with this stuff). The lunar ephemeris in that book comes from Jean Chapront's ELP2000/82 (which has since been fitted to DE430, among others).
Sure enough, when using that ELP2000 model to find the maximum lunar distance today, 13 September 2019, you get 2019-09-13 13:34. See the code below.
Meeus based his formulae on the 1982 version of the Éphéméride Lunaire Parisienne; the source code below leverages the 2002 update by Chapront, but it yields pretty much what those other sources are coming up with.
So I think my answer is: they are different because they use different models. Skyfield leverages the numerically integrated JPL Development Ephemerides, while ELP takes a more analytical approach.
In the end I realise it's a nit-pick; I just wanted to better understand the tools I'm using. But it raises the question: which approach is more accurate?
From what I've read, DE430 and its variants have been fit to observational data, namely Lunar Laser Ranging (LLR) measurements. If only for that LLR consideration, I think I'll stick with Skyfield for calculating lunar distance.
from elp_mpp02 import mpp02 as mpp
import julian
import pytz
import datetime

def main():
    mpp.dataDir = 'ELPmpp02'
    mode = 1  # Historical mode
    maxdist = 0
    apogee = None
    for x in range(10, 41):
        dt = datetime.datetime(2019, 9, 13, 13, x, tzinfo=pytz.timezone("UTC"))
        jd = julian.to_jd(dt, fmt='jd')
        lon, lat, dist = mpp.compute_lbr(jd, mode)
        if dist > maxdist:
            maxdist = dist
            apogee = dt
    print(f"{maxdist:.2} {apogee}")

main()

How do I create a data frame after using groupby.tail

I want to create a plot of the terrorist groups within my data. I have grouped the data and used .tail(), however I cannot plot the result:
<bound method Series.count of terrorist_group
Revolutionary Armed Forces of Colombia (FARC)     105
Al-Qaida in the Arabian Peninsula (AQAP)          142
Tehrik-i-Taliban Pakistan (TTP)                   157
Maoists                                           165
New People's Army (NPA)                           210
Boko Haram                                        234
Al-Shabaab                                        325
Islamic State of Iraq and the Levant (ISIL)       374
Taliban                                           775
Unknown                                          7973
dtype: int64>
terrorgroup=year2013.groupby("terrorist_group").size().sort_values(inplace=False)
gtf=terrorgroup.tail(10).count
gtf
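A minimal sketch of one way to end up with a plottable DataFrame, assuming year2013 is the DataFrame from the question: .count without parentheses returns the bound method itself (which is what the output above shows), so drop it and reset the index on the tail instead. The column name attacks is illustrative:
gtf = (
    year2013.groupby("terrorist_group")
            .size()
            .sort_values()
            .tail(10)
            .reset_index(name="attacks")  # Series -> two-column DataFrame
)
gtf.plot.barh(x="terrorist_group", y="attacks")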