Commit 9ed9013: Merge branch 'release/8.0.0'
araichev committed Oct 7, 2024
2 parents: 027309a + 6aa0224
Showing 26 changed files with 3,289 additions and 4,385 deletions.
83 changes: 20 additions & 63 deletions .github/workflows/test.yml
@@ -1,78 +1,35 @@
name: test
name: Test

on:
push:
branches: [master]
pull_request:
branches:
- master

jobs:
linting:
runs-on: ubuntu-latest
steps:
#----------------------------------------------
# check-out repo and set-up python
#----------------------------------------------
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
#----------------------------------------------
# load pip cache if cache exists
#----------------------------------------------
- uses: actions/cache@v3
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip
restore-keys: ${{ runner.os }}-pip
test:
needs: linting
runs-on: ubuntu-latest

strategy:
fail-fast: true
matrix:
os: [ "ubuntu-latest" ]
python-version: ["3.9", "3.10", "3.11"]
runs-on: ${{ matrix.os }}
python-version: ["3.10", "3.11", "3.12"]

steps:
#----------------------------------------------
# check-out repo and set-up python
#----------------------------------------------
- name: Check out repository
- name: Checkout code
uses: actions/checkout@v3
- name: Set up python ${{ matrix.python-version }}
id: setup-python

- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
#----------------------------------------------
# ----- install & configure poetry -----
#----------------------------------------------
- name: Install Poetry
uses: snok/install-poetry@v1
with:
virtualenvs-create: true
virtualenvs-in-project: true
#----------------------------------------------
# load cached venv if cache exists
#----------------------------------------------
- name: Load cached venv
id: cached-poetry-dependencies
uses: actions/cache@v3
with:
path: .venv
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}
#----------------------------------------------
# install dependencies if cache does not exist
#----------------------------------------------
- name: Install dependencies
if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry lock --no-update && poetry install --no-interaction --no-root
#----------------------------------------------
# install your root project, if required
#----------------------------------------------
# - name: Install library
# run: poetry install --no-interaction
#----------------------------------------------
# add matrix specifics and run test suite
#----------------------------------------------

- name: Install UV
run: |
curl -LsSf https://astral.sh/uv/install.sh | sh
- name: Sync project environment with UV
run: |
uv sync
- name: Run tests
run: |
source .venv/bin/activate
pytest -x
uv run pytest
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -10,6 +10,6 @@ repos:
- id: black
- repo: https://github.com/charliermarsh/ruff-pre-commit
# Ruff version.
rev: 'v0.6.1'
rev: 'v0.6.8'
hooks:
- id: ruff
10 changes: 10 additions & 0 deletions CHANGELOG.rst
@@ -1,6 +1,16 @@
Changelog
=========

8.0.0, 2024-10-08
-----------------
- Breaking change: removed the ``utm`` library dependency, deleted ``helpers.get_utm_crs``, and switched to the equivalent GeoPandas functionality instead.
- Changed ``routes.map_routes`` to accept a list of route short names, instead of or in addition to a list of route IDs; see the sketch below.
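
A minimal sketch of the new ``map_routes`` call, assuming the keyword is named ``route_short_names`` (an assumption; check the docstring for the exact signature)::

    import gtfs_kit as gk

    feed = gk.read_feed("path/to/feed.zip", dist_units="km")  # placeholder path
    # Select routes by short name; route IDs can still be passed alongside.
    m = feed.map_routes(route_short_names=["10", "27"])  # keyword name assumed
    m  # in a notebook, this renders the Folium map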

7.0.0, 2024-09-30
-----------------
- Switched from Poetry to UV for project management.
- Breaking change: removed the ``geometrize_stops`` function and moved its functionality into ``get_stops``; did the same for ``get_shapes``, ``get_trips``, and ``get_routes``. See the sketch below.
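
A hedged sketch of the new pattern, assuming the GeoDataFrame output is requested via an ``as_gdf`` flag (the exact keyword may differ; check the docstrings)::

    import gtfs_kit as gk

    feed = gk.read_feed("path/to/feed.zip", dist_units="km")  # placeholder path
    stops = feed.get_stops()                   # plain DataFrame, as before
    stops_gdf = feed.get_stops(as_gdf=True)    # GeoDataFrame, replacing geometrize_stops
    shapes_gdf = feed.get_shapes(as_gdf=True)  # same pattern for shapes, trips, and routes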

6.1.1, 2024-08-19
-----------------
- Changed grouped DataFrame ``feed._calendar_dates_g`` to indexed DataFrame ``feed._calendar_dates_i`` for consistency with ``feed._calendar_i`` and a slight speedup in the function ``trips.is_active_trip``.
11 changes: 6 additions & 5 deletions README.rst
@@ -2,18 +2,18 @@ GTFS Kit
********
.. image:: https://github.com/mrcagney/gtfs_kit/actions/workflows/test.yml/badge.svg

GTFS Kit is a Python 3.9+ library for analyzing `General Transit Feed Specification (GTFS) <https://en.wikipedia.org/wiki/GTFS>`_ data in memory without a database.
It uses Pandas and Shapely to do the heavy lifting.
GTFS Kit is a Python library for analyzing `General Transit Feed Specification (GTFS) <https://en.wikipedia.org/wiki/GTFS>`_ data in memory without a database.
It uses Pandas and GeoPandas to do the heavy lifting.


Installation
=============
``poetry add gtfs_kit``.
Install it from PyPI with UV, for example via ``uv add gtfs_kit``.


Examples
========
You can find examples in the Jupyter notebook ``notebooks/examples.ipynb``.
See the Jupyter notebook ``notebooks/examples.ipynb``.
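
For a quick taste, a minimal session might look like this (the feed path is a placeholder)::

    import gtfs_kit as gk

    feed = gk.read_feed("data/my_feed.zip", dist_units="km")  # placeholder path
    feed.describe()   # summary indicators for the feed
    feed.validate()   # basic checks on the GTFS tables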


Authors
@@ -23,7 +23,7 @@ Authors

Documentation
=============
Documentation is built via Sphinx from the source code in the ``docs`` directory then published to Github Pages at `mrcagney.github.io/gtfs_kit_docs <https://mrcagney.github.io/gtfs_kit_docs>`_.
The documentation is built via Sphinx from the source code in the ``docs`` directory and then published to GitHub Pages at `mrcagney.github.io/gtfs_kit_docs <https://mrcagney.github.io/gtfs_kit_docs>`_.


Notes
@@ -41,3 +41,4 @@ Notes
- GTFS time is measured relative to noon minus 12 hours, which can mess things up when crossing into daylight saving time.
I don't think this issue causes any bugs in GTFS Kit, but you and I have been warned.
Thanks to user derhuerst for bringing this to my attention in `closed Issue 8 <https://github.com/mrcagney/gtfs_kit/issues/8#issue-1063633457>`_.
- I'll probably remove the GTFS validation module ``validators.py`` to avoid duplicating the work of what is now `the canonical feed validator <https://github.com/MobilityData/gtfs-validator>`_ (written in Java).
2 changes: 1 addition & 1 deletion gtfs_kit/__init__.py
@@ -12,4 +12,4 @@
from .feed import *


__version__ = "6.1.1"
__version__ = "8.0.0"
38 changes: 19 additions & 19 deletions gtfs_kit/feed.py
@@ -16,12 +16,12 @@
Ignore that extra parameter; it refers to the Feed instance,
usually called ``self`` and usually hidden automatically by Sphinx.
"""

from pathlib import Path
import tempfile
import shutil
from copy import deepcopy
import zipfile
from typing import Optional, Union

import pandas as pd
from pandas.core.frame import DataFrame
@@ -88,25 +88,26 @@ class Feed(object):
build_zero_route_time_series,
compute_route_time_series,
build_route_timetable,
geometrize_routes,
routes_to_geojson,
map_routes,
)
from .shapes import (
append_dist_to_shapes,
geometrize_shapes,
get_shapes,
build_geometry_by_shape,
shapes_to_geojson,
get_shapes_intersecting_geometry,
)
from .stops import (
geometrize_stops,
ungeometrize_stops,
get_stops,
compute_stop_activity,
compute_stop_stats,
build_zero_stop_time_series,
compute_stop_time_series,
build_stop_timetable,
geometrize_stops,
build_geometry_by_stop,
stops_to_geojson,
get_stops_in_area,
@@ -125,7 +126,6 @@ class Feed(object):
compute_busiest_date,
compute_trip_stats,
locate_trips,
geometrize_trips,
trips_to_geojson,
map_trips,
)
@@ -176,20 +176,20 @@ class Feed(object):
def __init__(
self,
dist_units: str,
agency: Optional[DataFrame] = None,
stops: Optional[DataFrame] = None,
routes: Optional[DataFrame] = None,
trips: Optional[DataFrame] = None,
stop_times: Optional[DataFrame] = None,
calendar: Optional[DataFrame] = None,
calendar_dates: Optional[DataFrame] = None,
fare_attributes: Optional[DataFrame] = None,
fare_rules: Optional[DataFrame] = None,
shapes: Optional[DataFrame] = None,
frequencies: Optional[DataFrame] = None,
transfers: Optional[DataFrame] = None,
feed_info: Optional[DataFrame] = None,
attributions: Optional[DataFrame] = None,
agency: DataFrame | None = None,
stops: DataFrame | None = None,
routes: DataFrame | None = None,
trips: DataFrame | None = None,
stop_times: DataFrame | None = None,
calendar: DataFrame | None = None,
calendar_dates: DataFrame | None = None,
fare_attributes: DataFrame | None = None,
fare_rules: DataFrame | None = None,
shapes: DataFrame | None = None,
frequencies: DataFrame | None = None,
transfers: DataFrame | None = None,
feed_info: DataFrame | None = None,
attributions: DataFrame | None = None,
):
"""
Assume that every non-None input is a DataFrame,
@@ -499,7 +499,7 @@ def _read_feed_from_url(url: str, dist_units: str) -> "Feed":
return feed


def read_feed(path_or_url: Union[Path, str], dist_units: str) -> "Feed":
def read_feed(path_or_url: Path | str, dist_units: str) -> "Feed":
"""
Create a Feed instance from the given path or URL and given distance units.
If the path exists, then call :func:`_read_feed_from_path`.
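
# A hedged illustration of the Path | str signature above; the path and URL
# below are placeholders, not real feeds.
from pathlib import Path
import gtfs_kit as gk

feed_from_path = gk.read_feed(Path("data/my_feed.zip"), dist_units="km")
feed_from_url = gk.read_feed("https://example.com/gtfs.zip", dist_units="mi")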
42 changes: 6 additions & 36 deletions gtfs_kit/helpers.py
@@ -4,7 +4,7 @@

from __future__ import annotations
import datetime as dt
from typing import Optional, Union, Callable
from typing import Callable
import copy
from bisect import bisect_left, bisect_right
from functools import cmp_to_key
@@ -13,16 +13,14 @@
import pandas as pd
import numpy as np
import shapely.geometry as sg
from shapely.ops import transform
import utm
import json2html as j2h

from . import constants as cs


def datestr_to_date(
x: Union[dt.date, str], format_str: str = "%Y%m%d", *, inverse: bool = False
) -> Union[str, dt.date]:
x: dt.date | str, format_str: str = "%Y%m%d", *, inverse: bool = False
) -> str | dt.date:
"""
Given a string ``x`` representing a date in the given format,
convert it to a datetime.date object and return the result.
@@ -39,7 +37,7 @@ def datestr_to_date(


def timestr_to_seconds(
x: Union[dt.date, str], *, inverse: bool = False, mod24: bool = False
x: dt.date | str, *, inverse: bool = False, mod24: bool = False
) -> int:
"""
Given an HH:MM:SS time string ``x``, return the number of seconds
@@ -86,9 +84,7 @@ def timestr_mod24(timestr: str) -> int:
return result


def weekday_to_str(
weekday: Union[int, str], *, inverse: bool = False
) -> Union[int, str]:
def weekday_to_str(weekday: int | str, *, inverse: bool = False) -> int | str:
"""
Given a weekday number (integer in the range 0, 1, ..., 6),
return its corresponding weekday name as a lowercase string.
@@ -109,7 +105,7 @@ def weekday_to_str(


def get_segment_length(
linestring: sg.LineString, p: sg.Point, q: Optional[sg.Point] = None
linestring: sg.LineString, p: sg.Point, q: sg.Point | None = None
) -> float:
"""
Given a Shapely linestring and two Shapely points,
@@ -260,32 +256,6 @@ def is_not_null(df: pd.DataFrame, col_name: str) -> bool:
return False


def get_utm_crs(lat: float, lon: float) -> dict:
"""
Return a GeoPandas coordinate reference system (CRS) string
corresponding to the UTM projection appropriate to the given WGS84
latitude and longitude.
Code inspired by https://github.com/Turbo87/utm/issues/51.
"""
zone = utm.from_latlon(lat, lon)[2]
result = f"EPSG:326{zone:02d}" if lat >= 0 else f"EPSG:327{zone:02d}"
return result


def linestring_to_utm(linestring: sg.LineString) -> sg.LineString:
"""
Given a Shapely LineString in WGS84 coordinates,
convert it to the appropriate UTM coordinates.
If ``inverse``, then do the inverse.
"""

def proj(x, y):
return utm.from_latlon(y, x)[:2]

return transform(proj, linestring)
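
# The UTM helpers deleted above can be replaced with GeoPandas; this is a
# hedged sketch of equivalent behavior, not necessarily how GTFS Kit now
# does it internally.
import geopandas as gpd
import shapely.geometry as sg

ls = sg.LineString([(174.76, -36.85), (174.77, -36.86)])  # sample WGS84 coords
gs = gpd.GeoSeries([ls], crs="EPSG:4326")
utm_crs = gs.estimate_utm_crs()       # plays the role of get_utm_crs
ls_utm = gs.to_crs(utm_crs).iloc[0]   # plays the role of linestring_to_utm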


def get_active_trips_df(trip_times: pd.DataFrame) -> pd.Series:
"""
Count the number of trips in ``trip_times`` that are active