Download all ELS Instagram photos, preserving at least the date metadata, then
re-upload them to wiki.mako.cc (or some other location).
## Setup
You will need bot credentials for `wiki.mako.cc`; you can create these on the
wiki's special "Bot passwords" page. The bot needs permission to create images
and pages. Copy `./example.env` to `.env` and fill in the username and password
lines with the bot credentials, not your regular account info.
Use `pipenv` to create a Python virtualenv:

```
pipenv shell
pip install instaloader requests
```
The first time you run the instaloader download:

```
instaloader profile --no-compress-json extraordinaryleastsquares
```

Then, for updates:

```
instaloader --fast-update --no-compress-json extraordinaryleastsquares
```
## Re-Upload All
```
pipenv run ./update.sh
ls extraordinaryleastsquares/2020-0*.json | parallel -j1 ./reupload.py {}
```
## Experimentation
Setup is the same as above (`pipenv shell`, then `pip install instaloader
requests`). Then:

```
instaloader profile --no-compress-json extraordinaryleastsquares
```

This may print "Warning: Use --login to download HD version of profile
pictures". For later incremental runs:

```
instaloader --fast-update --no-compress-json extraordinaryleastsquares
```
For the MediaWiki API, the docs describe using `requests`: <https://www.mediawiki.org/wiki/API:Images#Python>
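The login-and-upload flow from those docs could look roughly like the sketch below. The `BOT_USER`/`BOT_PASS` variable names, the `api.php` endpoint path, and the helper names are assumptions for illustration, not taken from `reupload.py`:

```python
# Sketch of the MediaWiki bot login + file upload flow with requests,
# following the API docs linked above. Endpoint path and env var names
# (BOT_USER, BOT_PASS) are assumptions.
import os
import requests

API_URL = "https://wiki.mako.cc/api.php"  # assumed API endpoint


def make_page_text(comment: str, shortcode: str) -> str:
    """Build the wiki page text in the format described below."""
    return (
        "[[Category:Center for Extraordinary Quarantine Dining]]\n\n"
        "== Summary ==\n\n"
        f"{comment}\n\n"
        f"Imported from ELS instagram: https://www.instagram.com/p/{shortcode}/\n"
    )


def upload_image(path: str, filename: str, page_text: str) -> dict:
    """Log in with the bot credentials from .env and upload one file."""
    s = requests.Session()
    # 1. Fetch a login token.
    r = s.get(API_URL, params={"action": "query", "meta": "tokens",
                               "type": "login", "format": "json"})
    login_token = r.json()["query"]["tokens"]["logintoken"]
    # 2. Log in with the bot password.
    s.post(API_URL, data={"action": "login", "format": "json",
                          "lgname": os.environ["BOT_USER"],
                          "lgpassword": os.environ["BOT_PASS"],
                          "lgtoken": login_token})
    # 3. Fetch a CSRF token for the upload itself.
    r = s.get(API_URL, params={"action": "query", "meta": "tokens",
                               "format": "json"})
    csrf_token = r.json()["query"]["tokens"]["csrftoken"]
    # 4. Upload the file with the page text as its description.
    with open(path, "rb") as f:
        r = s.post(API_URL,
                   files={"file": (filename, f, "multipart/form-data")},
                   data={"action": "upload", "format": "json",
                         "filename": filename, "text": page_text,
                         "token": csrf_token, "ignorewarnings": 1})
    return r.json()
```

Tokens have to be fetched in two steps because login tokens and CSRF tokens are distinct in the MediaWiki API.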
Metadata for items:

```
filename: CEQD_<date>_<time>_<imagenum>.jpg
summary:
    <instagram_comment>
    Imported from ELS instagram: https://www.instagram.com/p/<short_id>/
```

page:

```
[[Category:Center for Extraordinary Quarantine Dining]]

== Summary ==

veggie #katsudon from a couple nights ago, galangal #broccoli on the side

Imported from ELS instagram: https://www.instagram.com/p/B9r_0Fsl1eh/
```
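Deriving that filename and summary from an instaloader JSON sidecar could be sketched as follows. The sidecar layout (a top-level `"node"` with `"taken_at_timestamp"`, `"shortcode"`, and the caption under `"edge_media_to_caption"`) is my reading of instaloader's uncompressed output and should be checked against a real file:

```python
# Sketch: build the CEQD_<date>_<time>_<imagenum>.jpg filename and the
# summary text from one instaloader sidecar dict. The sidecar key layout
# is an assumption about instaloader's --no-compress-json output.
from datetime import datetime, timezone


def item_metadata(sidecar: dict, imagenum: int = 1) -> tuple[str, str]:
    """Return (wiki_filename, summary) for one post."""
    node = sidecar["node"]
    # Post time is a unix timestamp; format it into the filename scheme.
    taken = datetime.fromtimestamp(node["taken_at_timestamp"], tz=timezone.utc)
    filename = f"CEQD_{taken:%Y-%m-%d_%H-%M-%S}_{imagenum}.jpg"
    # The caption, if any, is the first edge's text.
    edges = node["edge_media_to_caption"]["edges"]
    comment = edges[0]["node"]["text"] if edges else ""
    summary = (f"{comment}\n"
               f"Imported from ELS instagram: "
               f"https://www.instagram.com/p/{node['shortcode']}/")
    return filename, summary
```

A reupload script could `json.load` each sidecar, call `item_metadata`, and pass the results to the upload step.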