
Goal: download all ELS Instagram photos, with at least date metadata, then re-upload them to wiki.mako.cc (or some other location).

Setup

You will need bot credentials for wiki.mako.cc; you can get these from the wiki's special "bot passwords" page. The bot needs to be able to create images and pages. Copy ./example.env to .env and edit the username and password lines; put the bot info in, not your regular account info.
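The contents of example.env aren't reproduced here. Assuming it holds two shell-style variables (the names below are placeholders; check example.env for the real ones), .env would look something like this. Note that MediaWiki bot passwords log in as "YourUser@botname":

```shell
# Hypothetical variable names; see example.env for the real ones.
WIKI_BOT_USERNAME=YourUser@els-importer
WIKI_BOT_PASSWORD=abcdef0123456789
```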

Use pipenv to create a Python virtualenv:

pipenv shell
pip install instaloader requests

The first time you run the instaloader download:

instaloader --no-compress-json extraordinaryleastsquares

then for updates:

instaloader --fast-update --no-compress-json extraordinaryleastsquares

Re-Upload All

pipenv run ./update.sh

ls extraordinaryleastsquares/2020-0*.json | parallel -j1 ./reupload.py {}
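reupload.py isn't shown here, but its first step would be parsing one of those JSON metadata files. A sketch, assuming instaloader's uncompressed JSON keeps Instagram's GraphQL node layout (shortcode, taken_at_timestamp, caption under edge_media_to_caption); those field names are assumptions, not confirmed from the source:

```python
import json
from datetime import datetime, timezone

def parse_post(json_text):
    """Extract the fields reupload.py needs from one
    instaloader --no-compress-json metadata file.
    Field names here are assumptions about instaloader's layout."""
    data = json.loads(json_text)
    node = data.get("node", data)
    edges = node.get("edge_media_to_caption", {}).get("edges", [])
    caption = edges[0]["node"]["text"] if edges else ""
    taken = datetime.fromtimestamp(node["taken_at_timestamp"], tz=timezone.utc)
    return {
        "shortcode": node["shortcode"],
        "caption": caption,
        "date": taken.strftime("%Y-%m-%d"),
        "time": taken.strftime("%H-%M-%S"),
    }

# Hypothetical sample in the assumed layout:
sample = json.dumps({"node": {
    "shortcode": "B9r_0Fsl1eh",
    "taken_at_timestamp": 1583971200,
    "edge_media_to_caption": {"edges": [{"node": {"text": "veggie #katsudon"}}]},
}})
```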

Experimentation

The pipenv setup and instaloader commands are the same as above. The first instaloader run prints a warning:

instaloader --no-compress-json extraordinaryleastsquares
# "Warning: Use --login to download HD version of profile pictures"

For the MediaWiki API, the docs describe using requests: https://www.mediawiki.org/wiki/API:Images#Python
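Following those docs, the upload flow with requests looks roughly like this. The api.php URL is a guess at wiki.mako.cc's layout; the token/login/upload steps are the standard MediaWiki action API:

```python
import requests

API = "https://wiki.mako.cc/w/api.php"  # assumed api.php location

def upload_params(filename, comment, pagetext, token):
    """Build the POST body for action=upload."""
    return {
        "action": "upload",
        "filename": filename,
        "comment": comment,
        "text": pagetext,          # initial file description page
        "token": token,
        "ignorewarnings": "1",
        "format": "json",
    }

def upload_image(username, password, filename, filepath, pagetext, comment):
    """Log in with bot credentials and upload one image.
    Sketch only: no error handling, no retries."""
    s = requests.Session()
    # 1. Fetch a login token.
    r = s.get(API, params={"action": "query", "meta": "tokens",
                           "type": "login", "format": "json"})
    login_token = r.json()["query"]["tokens"]["logintoken"]
    # 2. Log in with the bot password ("YourUser@botname").
    s.post(API, data={"action": "login", "lgname": username,
                      "lgpassword": password, "lgtoken": login_token,
                      "format": "json"})
    # 3. Fetch a CSRF token for the upload.
    r = s.get(API, params={"action": "query", "meta": "tokens",
                           "format": "json"})
    csrf = r.json()["query"]["tokens"]["csrftoken"]
    # 4. Upload the file with its description page text.
    with open(filepath, "rb") as f:
        r = s.post(API, data=upload_params(filename, comment, pagetext, csrf),
                   files={"file": (filename, f, "image/jpeg")})
    return r.json()
```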

Metadata for items:

filename: CEQD_<date>_<time>_<imagenum>.jpg
summary:
    <instagram_comment>
    Imported from ELS instagram: https://www.instagram.com/p/<short_id>/

page:
    [[Category:Center for Extraordinary Quarantine Dining]]

    == Summary ==
    veggie #katsudon from a couple nights ago, galangal #broccoli on the side

    Imported from ELS instagram: https://www.instagram.com/p/B9r_0Fsl1eh/
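The filename and page layout above map directly to code. A sketch, taking the date, time, caption, and shortcode as already parsed out of the instaloader JSON:

```python
def image_filename(date, time, imagenum):
    """CEQD_<date>_<time>_<imagenum>.jpg"""
    return f"CEQD_{date}_{time}_{imagenum}.jpg"

def page_text(caption, shortcode):
    """Wikitext for the image description page, per the layout above."""
    return (
        "[[Category:Center for Extraordinary Quarantine Dining]]\n"
        "\n"
        "== Summary ==\n"
        f"{caption}\n"
        "\n"
        f"Imported from ELS instagram: https://www.instagram.com/p/{shortcode}/\n"
    )
```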