Current Antarctic large iceberg positions derived from ASCAT and OSCAT-2

Antarctic large iceberg positions derived from ASCAT and OSCAT-2. All data collected here are from the NASA SCP website.

Overview

This is a project that automatically scrapes data from https://www.scp.byu.edu/current_icebergs.html to get the current locations of all large icebergs in the Antarctic. The positions are derived from ASCAT and OSCAT-2. The file iceberg_location.json contains all the information collected from the page. It is typically updated once or twice a week (as per the updates on the website), usually on Mondays and possibly Fridays. Positions reported here are extracted from near-real-time ASCAT and OSCAT-2 data used in tandem. Positions reported in the full iceberg database are generated from science data and have been tracked more accurately; the full database is updated only a few times per year and can be accessed at https://www.scp.byu.edu/data/iceberg/database1.html.

The scheduled task of scraping the website runs every day.
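
How the scraper works, roughly: it fetches the page, parses the iceberg table, and writes the rows into iceberg_location.json keyed by the scrape date. The sketch below illustrates that flow using only the listed dependencies; the table layout, column order, and helper names are assumptions, not the exact contents of iceberg.py.

# Minimal sketch of the scraping flow (assumed page structure; not the exact iceberg.py).
import json
import urllib.request
from datetime import date

from bs4 import BeautifulSoup

URL = "https://www.scp.byu.edu/current_icebergs.html"

def scrape_current_icebergs():
    html = urllib.request.urlopen(URL).read()
    soup = BeautifulSoup(html, "lxml")
    # Assumption: the first table lists one iceberg per row with columns
    # name, latitude, longitude, DMS latitude, DMS longitude, last observation.
    rows = soup.find("table").find_all("tr")[1:]  # skip the header row
    icebergs = []
    for row in rows:
        cells = [td.get_text(strip=True) for td in row.find_all("td")]
        if len(cells) < 6:
            continue
        icebergs.append({
            "iceberg": cells[0].lower(),
            "lattitude": float(cells[1]),
            "longitude": float(cells[2]),
            "dms_lattitude": cells[3],
            "dms_longitude": cells[4],
            "recent_observation": cells[5],
        })
    return icebergs

if __name__ == "__main__":
    # Store the snapshot under today's date, matching the MM/DD/YY keys in the JSON.
    key = date.today().strftime("%m/%d/%y")
    with open("iceberg_location.json", "w") as f:
        json.dump({key: scrape_current_icebergs()}, f, indent=4)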

Requirements

  1. Python (3.6, 3.7, 3.8, 3.9)
  2. beautifulsoup4 (4.9.3)
  3. lxml (4.6.2)

Get started

Step 1: Install requirements

pip install -r requirements.txt

Step 2: Run the scraper

python iceberg.py

Step 3: Open iceberg_location.json to see the latest position details of the icebergs.
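
For quick inspection, a snippet like the one below loads iceberg_location.json and prints the most recent snapshot. It assumes the top-level keys are MM/DD/YY date strings, as in the example further down.

# Load the scraped JSON and print the latest snapshot (illustrative only).
import json
from datetime import datetime

with open("iceberg_location.json") as f:
    data = json.load(f)

# Pick the most recent scrape date by parsing the MM/DD/YY keys.
latest = max(data, key=lambda d: datetime.strptime(d, "%m/%d/%y"))
for berg in data[latest]:
    print(berg["iceberg"], berg["dms_lattitude"], berg["dms_longitude"],
          "observed on", berg["recent_observation"])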

JSON Schema

The file iceberg_location.json is structured according to the following JSON Schema:

{
    "$schema": "http://json-schema.org/draft-06/schema#",
    "type": "object",
    "additionalProperties": {
        "type": "array",
        "items": {
            "$ref": "#/definitions/ScriptElement"
        }
    },
    "definitions": {
        "ScriptElement": {
            "type": "object",
            "additionalProperties": false,
            "properties": {
                "iceberg": {
                    "type": "string"
                },
                "recent_observation": {
                    "type": "string"
                },
                "longitude": {
                    "type": "integer"
                },
                "dms_longitude": {
                    "type": "string"
                },
                "dms_lattitude": {
                    "type": "string"
                },
                "lattitude": {
                    "type": "integer"
                }
            },
            "required": [
                "dms_lattitude",
                "dms_longitude",
                "iceberg",
                "lattitude",
                "longitude",
                "recent_observation"
            ],
            "title": "ScriptElement"
        }
    }
}

Example

{
    "02/12/21": [
        {
            "iceberg": "a23a",
            "recent_observation": "02/09/21",
            "longitude": -400.0,
            "dms_longitude": "40 0'W",
            "dms_lattitude": "75 45'S",
            "lattitude": -7545.0
        },
        {
            "iceberg": "a63",
            "recent_observation": "02/09/21",
            "longitude": -5447.0,
            "dms_longitude": "54 47'W",
            "dms_lattitude": "71 41'S",
            "lattitude": -7141.0
        },
        {
            "iceberg": "a64",
            "recent_observation": "02/09/21",
            "longitude": -6038.0,
            "dms_longitude": "60 38'W",
            "dms_lattitude": "69 23'S",
            "lattitude": -6923.0
        },
        ................
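
If decimal degrees are needed, the dms_longitude and dms_lattitude strings are the safer fields to convert, since the numeric longitude and lattitude values appear to concatenate degrees and minutes (for example -7545.0 for 75 45'S). The helper below is only an illustration and is not part of the project.

# Convert a DMS string such as "75 45'S" or "54 47'W" to signed decimal degrees.
def dms_to_decimal(dms):
    value, hemisphere = dms[:-1].rstrip("'"), dms[-1]
    degrees, minutes = value.split()
    decimal = int(degrees) + int(minutes) / 60.0
    # South and West are negative by convention.
    return -decimal if hemisphere in ("S", "W") else decimal

print(dms_to_decimal("75 45'S"))   # -75.75
print(dms_to_decimal("54 47'W"))   # approximately -54.78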

Operational users please note: this list cannot possibly contain all potentially hazardous icebergs in Antarctic waters. Scatterometers such as ASCAT and OSCAT-2 were designed for measuring ocean winds, not icebergs, so scatterometer data are useful for tracking icebergs but limited. During the Austral summer, the contrast between the ocean and melting icebergs is reduced, which can result in gaps in visibility. Further, as the SCP team is not an operational agency, errors are expected and we cannot be held responsible for omissions or errors in this database. Also note that the large icebergs tracked here tend to shed smaller fragments, which are serious navigation hazards in nearby areas. Fragments (large and small) can drift substantial distances from their origins.
