obspy.clients.fdsn.mass_downloader.restrictions.Restrictions

class Restrictions(starttime, endtime, station_starttime=None, station_endtime=None, chunklength_in_sec=None, network=None, station=None, location=None, channel=None, exclude_networks=(), exclude_stations=(), limit_stations_to_inventory=None, reject_channels_with_gaps=True, minimum_length=0.9, sanitize=True, minimum_interstation_distance_in_m=1000, channel_priorities=('HH[ZNE12]', 'BH[ZNE12]', 'MH[ZNE12]', 'EH[ZNE12]', 'LH[ZNE12]', 'HL[ZNE12]', 'BL[ZNE12]', 'ML[ZNE12]', 'EL[ZNE12]', 'LL[ZNE12]', 'SH[ZNE12]'), location_priorities=('', '00', '10', '01', '20', '02', '30', '03', '40', '04', '50', '05', '60', '06', '70', '07', '80', '08', '90', '09'))[source]

Bases: object

Class storing non-domain restrictions for a query. This is best explained with two examples; see the list below for a more detailed explanation of the parameters. The first set of restrictions is useful for event-based earthquake data set queries.

>>> import obspy
>>> restrictions = Restrictions(
...     # Define the temporal bounds of the waveform data. For an
...     # event-based query these would typically be derived from the
...     # event origin time, e.g. 5 minutes before to one hour after
...     # the event; fixed dates are used here for illustration.
...     starttime=obspy.UTCDateTime(2012, 1, 1),
...     endtime=obspy.UTCDateTime(2012, 1, 2),
...     # You might not want to deal with gaps in the data.
...     reject_channels_with_gaps=True,
...     # And you might only want waveforms that have data for at least
...     # 95 % of the requested time span.
...     minimum_length=0.95,
...     # No two stations should be closer than 10 km to each other.
...     minimum_interstation_distance_in_m=10E3,
...     # Only HH or BH channels. If a station has HH channels,
...     # those will be downloaded, otherwise the BH. Nothing will be
...     # downloaded if it has neither.
...     channel_priorities=["HH[ZNE]", "BH[ZNE]"],
...     # Location codes are arbitrary and there is no rule as to which
...     # location is best.
...     location_priorities=["", "00", "10"])

And the restrictions for downloading a noise data set might look similar to the following:

>>> import obspy
>>> restrictions = Restrictions(
...     # Get data for a whole year.
...     starttime=obspy.UTCDateTime(2012, 1, 1),
...     endtime=obspy.UTCDateTime(2013, 1, 1),
...     # Chunk it to have one file per day.
...     chunklength_in_sec=86400,
...     # Considering the enormous amount of data associated with
...     # continuous requests, you might want to limit the data based on
...     # SEED identifiers. If the location code is specified, the
...     # location priority list is not used; the same is true for the
...     # channel argument and priority list.
...     network="BW", station="A*", location="", channel="BH*",
...     # A typical use case for such a data set is noise correlation,
...     # where gaps are dealt with at a later stage.
...     reject_channels_with_gaps=False,
...     # The same is true for the minimum length: any data during a
...     # day might be useful.
...     minimum_length=0.0,
...     # sanitize makes sure that each MiniSEED file also has an
...     # associated StationXML file; MiniSEED files without one are
...     # deleted afterwards. This is not desirable for large noise
...     # data sets.
...     sanitize=False,
...     # Guard against the same station having different names.
...     minimum_interstation_distance_in_m=100.0)
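
A Restrictions object on its own does not download anything; it is passed, together with a geographical domain, to a MassDownloader which then queries the data centers. The following minimal sketch shows the typical wiring; the domain extent, the provider name, and the "waveforms" and "stations" directories are illustrative choices, not requirements.

import obspy
from obspy.clients.fdsn.mass_downloader import (
    RectangularDomain, Restrictions, MassDownloader)

# Geographical constraints live in a separate domain object.
domain = RectangularDomain(minlatitude=45.0, maxlatitude=50.0,
                           minlongitude=10.0, maxlongitude=15.0)

restrictions = Restrictions(
    starttime=obspy.UTCDateTime(2012, 1, 1),
    endtime=obspy.UTCDateTime(2012, 1, 2),
    channel_priorities=["HH[ZNE]", "BH[ZNE]"])

# The downloader combines the domain and the restrictions and writes
# MiniSEED and StationXML files to the given directories.
mdl = MassDownloader(providers=["IRIS"])
mdl.download(domain, restrictions, mseed_storage="waveforms",
             stationxml_storage="stations")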

The network, station, location, and channel codes are passed directly to the station service of each fdsn-ws implementation and can therefore take comma-separated string lists as arguments, e.g.

restrictions = Restrictions(
    ...
    network="BW,G?", station="A*,B*",
    ...
    )

Not all fdsn-ws implementations support the direct exclusion of network or station codes. The exclude_networks and exclude_stations arguments should thus be used for that purpose to ensure compatibility across all data providers, e.g.

restrictions = Restrictions(
    ...
    network="B*,G*", station="A*, B*",
    exclude_networks=["BW", "GR"],
    exclude_stations=["AL??", "*O"],
    ...
    )

It is also possible to restrict the downloaded stations to those that are part of an existing inventory object, which can originate from a StationXML file or from any other source. Only stations contained in the inventory will be kept; channels are still selected dynamically based on the other restrictions. Keep in mind that all other restrictions still apply: passing an inventory only further restricts the data that may be downloaded.

restrictions = Restrictions(
    ...
    limit_stations_to_inventory=inv,
    ...
    )
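
A minimal sketch of this, assuming the stations to keep are described by an existing StationXML file (the file name is purely illustrative):

import obspy
from obspy.clients.fdsn.mass_downloader import Restrictions

# Inventory containing the stations the download should be limited to.
inv = obspy.read_inventory("existing_stations.xml")

restrictions = Restrictions(
    starttime=obspy.UTCDateTime(2012, 1, 1),
    endtime=obspy.UTCDateTime(2012, 1, 2),
    limit_stations_to_inventory=inv)
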
Parameters:
  • starttime (UTCDateTime) – The start time of the data to be downloaded.

  • endtime (UTCDateTime) – The end time of the data.

  • station_starttime (UTCDateTime) – The start time of the station files. If not given, the starttime argument will be used. This is useful when combining multiple waveform data sets with a central station file archive, as the StationXML files can then be downloaded once for the whole time span (see the sketch after this parameter list).

  • station_endtime (UTCDateTime) – The end time of the station files. Analogous to the station_starttime argument.

  • chunklength_in_sec (float) – The length of one chunk in seconds. If set, the time between starttime and endtime will be divided into segments of chunklength_in_sec seconds. Useful for continuous data requests. Set to None if one piece of data is desired between starttime and endtime (the default).

  • network (str) – The network code. Can contain wildcards.

  • station (str) – The station code. Can contain wildcards.

  • location (str) – The location code. Can contain wildcards.

  • channel (str) – The channel code. Can contain wildcards.

  • exclude_networks (list[str]) – A list of potentially wildcarded networks that should not be downloaded.

  • exclude_stations (list[str]) – A list of potentially wildcarded stations that should not be downloaded.

  • limit_stations_to_inventory (Inventory) – If given, only stations that are part of this inventory object will be downloaded. All other restrictions still apply; this just serves to further limit the set of stations to download.

  • reject_channels_with_gaps (bool) – If True (default), MiniSEED files with gaps and/or overlaps will be rejected.

  • minimum_length (float) – The minimum length of the data as a fraction of the requested time frame. After a channel has been downloaded, it is checked that its total length is at least that fraction of the requested time span; it is rejected otherwise. Must be between 0.0 and 1.0; defaults to 0.9.

  • sanitize (bool) – If True (default), make sure that each MiniSEED file also has an associated StationXML file; MiniSEED files without one are deleted afterwards. This is potentially not desirable for large noise data sets.

  • minimum_interstation_distance_in_m (float) – The minimum inter-station distance. Data from any new station closer than this to an already accepted station will not be downloaded. Also used for duplicate station detection, as stations sometimes have different names at different web service providers. Defaults to 1000 m.

  • channel_priorities (list[str]) – Priority list for the channels. Will not be used if the channel argument is used.

  • location_priorities (list[str]) – Priority list for the locations. Will not be used if the location argument is used.
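
As referenced from the station_starttime description above, a minimal sketch that requests waveforms for a single day while keeping the station metadata valid for a whole year, so the same StationXML archive can be reused for other waveform downloads:

import obspy
from obspy.clients.fdsn.mass_downloader import Restrictions

restrictions = Restrictions(
    # Waveform data for one day only ...
    starttime=obspy.UTCDateTime(2012, 1, 1),
    endtime=obspy.UTCDateTime(2012, 1, 2),
    # ... but station files covering the whole year, so they can be
    # downloaded once and shared across several waveform requests.
    station_starttime=obspy.UTCDateTime(2012, 1, 1),
    station_endtime=obspy.UTCDateTime(2013, 1, 1))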

Special Methods

Restrictions.__delattr__(name, /)

Implement delattr(self, name).

Restrictions.__dir__()

Default dir() implementation.

Restrictions.__eq__(other)[source]
Restrictions.__format__(format_spec, /)

Default object formatter.

Restrictions.__ge__(value, /)

Return self>=value.

Restrictions.__getattribute__(name, /)

Return getattr(self, name).

Restrictions.__gt__(value, /)

Return self>value.

Restrictions.__init__(starttime, endtime, station_starttime=None, station_endtime=None, chunklength_in_sec=None, network=None, station=None, location=None, channel=None, exclude_networks=(), exclude_stations=(), limit_stations_to_inventory=None, reject_channels_with_gaps=True, minimum_length=0.9, sanitize=True, minimum_interstation_distance_in_m=1000, channel_priorities=('HH[ZNE12]', 'BH[ZNE12]', 'MH[ZNE12]', 'EH[ZNE12]', 'LH[ZNE12]', 'HL[ZNE12]', 'BL[ZNE12]', 'ML[ZNE12]', 'EL[ZNE12]', 'LL[ZNE12]', 'SH[ZNE12]'), location_priorities=('', '00', '10', '01', '20', '02', '30', '03', '40', '04', '50', '05', '60', '06', '70', '07', '80', '08', '90', '09'))[source]
Restrictions.__init_subclass__()

This method is called when a class is subclassed.

The default implementation does nothing. It may be overridden to extend subclasses.

Restrictions.__iter__()[source]

Iterator yielding time intervals based on the chunklength and temporal settings.
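
For example, with chunklength_in_sec set, iterating over a Restrictions object yields consecutive time intervals covering the requested span; a minimal sketch (each interval is expected to be a (start, end) pair of UTCDateTime objects, the last one truncated at endtime):

import obspy
from obspy.clients.fdsn.mass_downloader import Restrictions

restrictions = Restrictions(
    starttime=obspy.UTCDateTime(2012, 1, 1),
    endtime=obspy.UTCDateTime(2012, 1, 3),
    chunklength_in_sec=86400)

# One interval per daily chunk.
for interval in restrictions:
    print(interval)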

Restrictions.__le__(value, /)

Return self<=value.

Restrictions.__lt__(value, /)

Return self<value.

Restrictions.__ne__(other)[source]
Restrictions.__new__(**kwargs)
Restrictions.__reduce__()

Helper for pickle.

Restrictions.__reduce_ex__(protocol, /)

Helper for pickle.

Restrictions.__repr__()

Return repr(self).

Restrictions.__setattr__(name, value, /)

Implement setattr(self, name, value).

Restrictions.__sizeof__()

Size of object in memory, in bytes.

Restrictions.__str__()

Return str(self).

Restrictions.__subclasshook__()

Abstract classes can override this to customize issubclass().

This is invoked early on by abc.ABCMeta.__subclasscheck__(). It should return True, False or NotImplemented. If it returns NotImplemented, the normal algorithm is used. Otherwise, it overrides the normal algorithm (and the outcome is cached).