- emailed re**@psu.edu[^got reply 28 dec 2024]
- emailed rod**.historiador@gmail.com[^no reply as of 22 jan 2025]
- no calendar work in `historical-series-re-Belize/work/apr-24`
- none in `/oct-23`
- `/versioning/calendar.tsv` not filled in
- check `sheets varids.tsv` for earliest file w/ calendar
    - cartas first pop up in varid 70 ie in 10-30 sep 2021
- check `pre-git-hist/events.tsv` for sep 2021
    - cartas in `files/red/rep bz fin pub 1.3_stable *.tsv` written 7 sep 2021[^seems like a stable ``.numbers`` to ``.tsv`` export so prolly had cartas by 7 sep]
    - so cartas first pop up in varid 75 ie by 7 sep[^in `sheets/rep bz fin pub 1.3_stable.tsv`]
- check events again for post-`v-1.4` work[^cartas next touched in `pre-git-hist/files/purp/1638 series *.tsv` on 9 jul 2022, then `purp/PARES-*.txt`, then `purp/1638 cartas *.tsv`, then `yell/data *.tsv` on 27 jul 2022 w last recorded mod on 3 aug 2022]
- check work in `work/*-22`
- check work in `work/*-23`
- get latest data version in `txtforgit`[^got in `tsv` and `json`]
- split `list.json` for easier handling?[^yes with ``for i in `seq 0 5239`; do jq ".[$i]" list.json > "c$i.json"; done``, plus remove `cartas/list.*` to keep only object files]
- fix `cod` values[^must be `true` if any `data` value is not `null` - done but not rechecked]
- mark `true` for null values of `cod` for cartas de Guat/Mex w/in 15 yrs of 1630[^abort at 1620 as can just navigate by `lbd` value]
- add `ref` values to `cat` for easier navigation[^via eg `jq '.cat += {"ref" : "GUATEMALA,39,R.14,N.93"}' c0.json > tmp && mv tmp c0.json`]
- coded 1638 dated records[^checked for thoroughness but not accuracy too - all such files now have non-null `cod` and data in `data` if appropriate, but pre-existing data in `data` not reviewed ie only `false cod` should have null `data`, but `true cod` may have inaccurate or incomplete pre-existing `data`]
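The array split in the note above can also be sketched in Python (a sketch only; `list.json` and the `c<i>.json` naming come from the log, the function name is invented):

```python
import json
from pathlib import Path

def split_array(src: Path, out_dir: Path, prefix: str = "c") -> int:
    """Write each object of a JSON array to its own <prefix><i>.json file,
    like the looped `jq ".[$i]" list.json > "c$i.json"` in the note."""
    records = json.loads(src.read_text())
    out_dir.mkdir(parents=True, exist_ok=True)
    for i, record in enumerate(records):
        (out_dir / f"{prefix}{i}.json").write_text(json.dumps(record, indent=2))
    return len(records)
```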
- ~~code cartas de gobernadores de Hon/Yuc first?~~
- ~~coded 1639 records~~
- further ensure each 1638 record is calendared w/ summary/extract, either via `cat` or added in `data`[^except records w/c seem irrelevant from `cat` eg geographically irrelevant, and non-digitised records]
- code 1639 records[^fill in any null `cod`, fill in null `mil` for `true cod`, and add record summary in `data` if one in `cat` is weak (for relevant records)]
- code 1640 records
- code 1641 records
- code 1642 records
- code 1643 records
- code 1644 records
- code 1645 records
- code 1646 records
- code 1647 records
- code 1648 records
- code 1649 records
- code 1650 records
- code 1637 records
- code 1636 records
- code 1635 records
- filed AGI formulario de alta[^got credentials 2 jan 2025]
- code 1634 records
- code 1633 records
- code 1632 records
- code 1631 records
- code 1630 records
- code 1651 records
- code 1629 records
- code 1652 records
- code 1628 records
- code 1653 records
- code 1627 records
- code 1654 records
- code 1626 records
- code 1655 records
- code 1625 records
- code 1656 records
- code 1624 records
- code 1657 records
- code 1623 records
- code 1658 records
- code 1622 records
- code 1621 records
- code 1620 records
- code 1619 records
- code 1618 records
- give full citation for `lit_n` short cites
- get prior log for 17th cent piracy- or defence-against-piracy events in Hond Bay work
- search `historical-series-re-Belize/work/versioning/events.tsv` for earliest file in `/versioning/sheets varids.tsv` with requisite events records
    - requisite events record first pops up in varid 212 ie in 13-14 feb 2021[^1688 hostile action only, not specifically cited but likely not assumed (given vars press sources in sheet)]
- check `pre-git-hist/events.tsv` for feb 2021 and prior
    - requisite events record noted in `files/tran/The Baymen's experiment.rtf` written 1-4 oct 2020[^1638 Hond settlement only, but no citation and ~~possibly~~ likely assumed]
    - and noted in `files/tran/THIS SETTLEMENT-TURNED-COL.rtf` written 5 oct 2020[^again 1638 settlement only, ~~possibly~~ likely assumed]
    - and no other pre-feb 2021 records
- check events again for post-`v-1.4` work[^events next touched *possibly* in `red/prelim rep bz 1.4 WORK *.tsv` on 7 nov 2021, then (sort of) in `wiki/Timeline_of_piracy_in_the_Bay_of_Honduras.xml` on 18 apr 2022 (with other wikis sort of also using events data but disregarded here in favour of only this one), then in `blue/orig data events *.tsv` on 18 may 2022, then in `purp/1638 series 1.2 Events*.tsv` on 9 jul 2022, then *possibly* in `red/rep bz fin pub 1.4 Events*.tsv` on 13 jul 2022, then in `purp/1638 series 1.3 Events*.tsv` and `purp/1638 series 1.5 Events*.tsv` on 18 jul 2022, then in `purp/1638 series 1.5 safe copy Events*.tsv` on 20 jul 2022, then in `harv/1638*.tsv` on 3 aug 2022, none further]
- check for events work in `historical-series-re-Belize/work/*-22`[^null]
- check in `work/*-23`[^null]
- check in `work/*-24`[^null]
- get latest records for eligible events in `json` to `pp-0/events`[^except for `/wiki` records or data, to be added later, and assuming no change in `red/rep bz fin pub 1.4 Events*.tsv` nor `red/prelim rep bz 1.4 WORK *.tsv`]
- write `events/readme.md`
- get `v-1.4` events records into one sheet, but exclude records with neither upper nor lower bound in 17th cent
    - further remove meteorological and biological events
- add post-`v-1.4` event records starting with latest version in `harv/1638*.tsv`
    - check that added records use decipherable short citations[^yes in `blue/orig data sources 1.2*.tsv` (in `harv/1638 sources*.tsv` too) for records from `harv/1638 series*.tsv`]
- fix `id` values, and assign missing ones[^non-lossy odd ids > 2163 assigned ie `e2165` to `e2183`]
- fix null values[^keeping `xxx` in-text null/missing value placeholder]
- check for special `json` chars
- get `json` strings to single `tsv`
- fix character encoding issue[^Sublime Text was bugging]
- figure out how to loop over[^with `for i in $(cat ids-list.txt); do jq '.[] | select (.id=="e'"$i"'")' events.json > "e$i.json"; done` for list of ids in `ids-list.txt` and array of all objects in `events.json`]
- extract single-object `json` files
- fill in `cod` values for all `events/e*.json` files
- check for `/events` completeness against missed post-`v-1.4` records in `/wiki`
- fix newline occurrences[^with `sed -i '' 's: /\\n :\\n :g' * `]
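The per-id extraction loop above can be sketched in Python too (hypothetical function name; assumes, per the log, that `events.json` holds an array of objects with string `id` values like `e1`):

```python
import json
from pathlib import Path

def extract_by_id(events_path: Path, ids: list[str], out_dir: Path) -> list[str]:
    """For each numeric id write the matching object from the events array
    to e<id>.json, like the `jq select(.id=="e$i")` loop in the note."""
    by_id = {e["id"]: e for e in json.loads(events_path.read_text())}
    out_dir.mkdir(parents=True, exist_ok=True)
    written = []
    for i in ids:
        key = f"e{i}"
        if key in by_id:
            (out_dir / f"{key}.json").write_text(json.dumps(by_id[key], indent=2))
            written.append(key)
    return written
```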
- did up to 1605
- pull short citations before proceeding to fill in info gaps
- ~~review up to 1605 to make sure all refs are in record~~[^do later]
- 1600s done
- 1610s done
- 1620s done[^used `jq -s '.' e*.json > temp.json` to merge events jsons into array in `temp.json`, then `jq '.[] | select(.key != null) | select(.key | contains("foo")) .id' temp.json` to query for `foo` value in `key` to find duplicates or events with tricky start/end dates]
- 1630s
- 1640s
- 1650s
- 1660s
- 1670s
- 1680s
- 1690s
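The merge-and-query step in the 1620s note can be sketched in Python (assuming per-event files each holding one object with `id` and an optional `key`, as the log's `jq -s` / `contains()` pipeline implies; the function name is invented):

```python
import json
from pathlib import Path

def find_key_matches(event_files: list[Path], needle: str) -> list[str]:
    """Merge single-object jsons (cf `jq -s '.' e*.json`) and return ids of
    events whose non-null `key` contains `needle` (cf the contains() query)."""
    events = [json.loads(p.read_text()) for p in event_files]
    return [e["id"] for e in events
            if e.get("key") is not None and needle in e["key"]]
```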
- check for completeness against published lit[^eg ~~file for `/harv` full citations~~ `harv` citations not yet reviewed ie `jq '.[] | select (.via=="harv" and .short_cite==null and .old_short_cite==null)' citations.json`][^with `null` value for `via`, non-lossy odd ids > 2183 assigned ie `e2185` to `e2191`]
- add Dutch newspapers[^eg 1674 cruise in Amsterdam Courant no 23, etc]
- add French newspapers[^only 1687-88 attack in Paris Gazette no 24 found]
- add Cordingly Under the Black Flag[^a `harv` ref, but used in a `v-1.4` event]
- ~~add Dow Pirates of the New England Coast~~[^no 17th cent events]
- add Salmoral Piratas, bucaneros, filibusteros[^a `harv` ref, but used in a `v-1.4` event]
- add Marley Sack of Veracruz
- ~~add Marley Pirates and Privateers~~[^2010 ed of this source covered]
- add Skowronek X Marks the Spot
- add Moreau Pirates
- add missing press from `yell/data serial*.tsv`
- get all short cites in `/events` for `true` and `null` `cod` values
    - for `harv` done
    - for `v-1.4`
- get into `bib` `json` file[^`bib` seems too hard for this]
    - done for `harv`
    - done for `v-1.4`
- add `/cartas` and `/events` citations to `pp-0`
- code `cartas`
    - 1617
    - 1616
    - 1615
    - 1614
    - ~~1613~~
    - 1612
    - ~~1611~~
    - 1610
    - ~~1609~~
    - ~~1608~~
    - 1659
    - 1660
    - 1661
    - 1662[^but some records to be coded left with `null` `mil` values due to lack of marginal notes/covering folio, may review later]
    - 1663
    - 1664
    - 1665
    - 1666
    - 1667
    - 1668
- figure out inclusion/exclusion used for `cartas`[^from `v-1.4/Introduction*.tsv`, 993 AGI records in 1610-1680 from geographically relevant governor, audiencia, or cabildo, it seems; from `harv/1638 series 1.5-Cartas*.tsv`, 4227 AGI records in 1630-1679 at levels Us or Uc in GUATEMALA or MEXICO series, plus 20 AGI records from `purp/1638 series 1.3.numbers - Cartas-0*.tsv` in MEXICO,362 or MEXICO,359 in 1610-1681 from Yuc Gov, it seems]
- check for `cartas` completeness[^ought to have all Uc or Us level AGI records with start or end date in 1630-1679 in GUATEMALA or MEXICO series, with possibly some (but likely not all) relevant AGI records for 1610-1629 and 1680-1681 dates, and likely none for other 17th cent dates]
    - Uc in GUAT for 1630 - yes
    - Us in GUAT for 1630 - yes
    - ~~Us and Uc in MEX for 1630~~[^assume yes]
- ~~extract `events` into annual `incidents`~~
- write `/incidents/readme.md`
- write null `incidents` jsons
- fill in 1601 year
- fill in 1602 year
- fill 1603
- fill 1604
- fill 1605
- ~~fill to 1606-1610~~
- fill 1606
- fill 1607
- ~~fill 1608~~
- check 1601-1607 years to ensure all relevant events are included in `notes` key
- check duplicate/dubious events to 1607 do not have `true` in `cod`
- ~~fix duplicate or dubious `events`~~
- add Garcia P Memorias[^per Lujan 2 472, 478]
- check all Lujan 2 472 refs are present
- check all Lujan 2 823-831 refs are present
- check events up to 1605 are sorted
- ~~sort 1606-1610 events~~
- get PARES ids for all 1606 Uc and Us AGI records in GUAT[^31 records, but couldn't `curl` results, had to manually get `html` table of hits in `ids.txt`, then get ids by `grep -Eo 'value=.{0,8}' results.txt`]
- get `xml` files for them[^loop `for i in $(cat ids.txt); do curl -o $i.xml http://pares.mcu.es/ParesBusquedas20/catalogo/description/exportEAD/$i; done` for PARES ids in `ids.txt`]
- code 1606-dated GUAT `cartas` to sort out 1606
- sort 1606 events
- code 1607 dated GUAT `cartas` to sort out 1607
- sort 1607 events
- ~~sort 1608 events~~
- get all missing `cartas`
- get PARES ids
- for GT Uc and Us for 1606[^31 hits]
- for MX Uc and Us for 1606[^169 hits]
- for GT Uc and Us for 1601-1605[^154 hits]
- for MX Uc and Us for 1601-1605[^791 hits]
- for GT Uc and Us for 1607-1629[^840 hits]
- for MX Uc for 1607-1629[^1476 hits]
- for MX Us for 1607-1629[^2695 hits]
- for GT Uc and Us for 1680-1700[^924 hits]
- for MX Uc and Us for 1680-1700[^1444 hits]
- clean up ids
- get `xml` files
- make sure `curl` missed none[^do `for i in $(cat ids.txt); do [ -f xmls/$i.xml ] && echo yes - $i || echo no - $i; done` for files in `/xmls` and PARES ids in `ids.txt` - returns yes for 8493 ids, but got only 7135 `xml` files - so def vars duplicate ids, but a check (`sort ids.txt | uniq -c`) shows they show up all over the place rather than clustering cleanly - given that ids were written in the chronological order in which we got them, this is prolly not a mistake - duplicates prolly are duplicate hits from PARES from Uc records? - assume so]
- clean up `xml` files
    - merge all
    - ~~save merged file in UTF-8~~[^via TextEdit UI]
    - remove whitespace chars[^~~repeated spaces (`tr -s ' ' < foo.txt > tmp && mv tmp foo.txt`) and newlines (`tr -d '\n' < foo.txt > tmp && mv tmp foo.txt`)~~ manually]
    - ~~get UTF charset~~[^original ISO-8859-1, but seems too messy to fix right now]
- get PARES ids from `cartas`, to cross check against the new ones[^7135 unique new ids and files, 5240 unique old ids and files - but 149 of the new ids already exist in old ids]
- delete 149 duplicate new `xml` files[^6986 unique new files left]
- get merged `xml` in `numbers`, and extract relevant fields for `cartas` jsons
- get to `cartas` `json` format[^c5240 to c12225 ids assigned]
- add to `/cartas`
- update relevant readmes
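The download check and duplicate count above can be sketched in Python (hypothetical helper; assumes the `<id>.xml` naming used in the `curl` loop):

```python
from collections import Counter
from pathlib import Path

def check_downloads(ids: list[str], xml_dir: Path):
    """Return (missing, duplicates): ids with no <id>.xml in xml_dir, and ids
    occurring more than once, cf the `[ -f xmls/$i.xml ]` loop and
    `sort ids.txt | uniq -c` checks in the note."""
    counts = Counter(ids)
    missing = sorted(i for i in counts if not (xml_dir / f"{i}.xml").exists())
    duplicates = {i: n for i, n in counts.items() if n > 1}
    return missing, duplicates
```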
- code 1601 GUAT `cartas`[^just to check `events` completeness, as no Guat events present for said year - none found]
- fix duplicate or dubious `events`, then extract into annual `incidents`[^combined from previous items]
- for 1608
- 1609
- 1610
- 1611
- 1612
- 1613
- 1614
- 1615
- 1616
- 1617
- 1618
- 1619
- 1620
- 1621
- 1622
- 1623
- 1624
- 1625
- 1626
- 1627
- 1628
- 1629
- 1700
- 1690s
- 1680s
- check GUAT `cartas` 1683-1685 for tricky `events`
    - 1683
    - 1684
    - 1685
- ~~check 1680-1685 `events` prim and sec sources to sort them out~~[^might do later]
- check 1680s duplicate `events` to get `null` or `false` in `cod` key, or correct `cod_n`
- 1630s
- 1640s
- 1650s
- 1660s
- 1670s
- ~~check `incidents` against `cartas`~~[^do later]
    - for 1650s[^since these are fully coded, do eg `jq '.[] | select(.cat.lbd!=null) | select(.cat.lbd|contains("165")) | select(foo) .id' temp.json` where `foo` is `.work.cod==null` to get any uncoded 1650s cartas, and is `.data.mil==true` to get only relevant ones, with all `cartas` in `temp.json`]
        - are fully coded
        - 12 relevant hits[^do `jq -j '.[] | foo .id' temp.json | sed 's/c/|/g;s/|/"id": "c(/;s/$/)",/'` to get relevant ids into Sublime Text regex string, for relevant query `foo` and all `cartas` in `temp.json`]
    - ~~for 1640s~~[^do later]
    - for 1635, 1649, 1661-1664, 1668[^assuming fully coded cartas, these had no incidents per `jq -r '.[] | "\(.yr)\t\(.tally)"' temp.json | sort -n` for all incidents in `temp.json`]
        - are fully coded
        - 8 hits
    - ~~for 1640-1648~~[^did to 1643 - do later 1644-1648]
        - are fully coded
        - 25 hits
- ~~for `incidents`, may count only certain defence acts (eg military/nautical acts) and not others (building forts/sending arms/naming officers/forming militia) as latter are messy to count? - or for `cartas`, `def` could refer to state of defence generally or particular incidents, to plot defence mention in `cartas` over time directly from them, or add a `cartas` tally to `incidents`~~[^might do later]
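The decade queries in the notes above can be sketched in Python (hypothetical helper; assumes each carta object carries the `cat.lbd`, `work.cod`, and `data.mil` keys named in the log):

```python
def decade_hits(cartas, decade, pred):
    """Ids of cartas whose `cat.lbd` contains `decade` and that satisfy
    `pred`, cf the jq chain select(.cat.lbd|contains("165")) | select(foo)."""
    return [c["id"] for c in cartas
            if (c.get("cat") or {}).get("lbd") and decade in c["cat"]["lbd"]
            and pred(c)]
```

For the log's two queries, `pred` would be eg `lambda c: c["work"]["cod"] is None` (uncoded) or `lambda c: c["data"]["mil"] is True` (relevant).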
- start `pp.tex`[^structure copied from latest `tex` file ie in `historical-series-re-Belize/work/oct-23/`]
- continue pp[^input variables for dates/info relying on `incidents` or `/literature` data, or estimate them for draft]
- edit copy
- submit preprint[^`pp1` to SocArXiv on 2 mar 2025, plus `pp-0` repo made public on GitHub]
- preprint pubbed[^in SocArXiv on 3 mar 2025]
- requested comments from am*tor@yahoo.com[^on 5 mar 2025, got them on 6 mar 2025]
- fix vars minor mistakes
- ~~a name 'Tipu rebellion' sans capitals~~[^eg if event is unique to its class (ie no other rebellions in Tipu), and date markers stripped, and synonyms not accepted (eg 'Elizabethan era' properly names but 'Belizean independence' does not) -- note excluded]
- general do not recommend deferring to popular senses/meanings of historical concepts eg of founding, settlement[^maybe only licensed if the proper sense is disputed, or if examined action is of proper sort (eg social/constructed, like marriage or illegal moves in chess, or necessarily intentional, or complex/compound), or if truth value of relevant historical claims turn on sense used, so on, but *not* deference out of respect for agents]
- might fix loose definitions/glosses[^if vs only if, qualifiers, so on — sort of fixed]
- note anachronistic use of term Belize[^plus maybe make use vs mention explicit, plus maybe note pp does not concern etymology, plus maybe note relevant periodisation (Sp to 1638, Precol to 1862)]
- add spaces to keywords in metadata
- note previous reviews/discussions on theories re Bz origins[^for Wallacian theories already noted, plus Bolland, Molina, Ancona, C Quijano, so on, plus maybe add first recognition of problem]
- note explicitly why Wallace ought to be read symbolically[^ie why we ought not discard all Wallace theories esp archival ones (eg historian with sources \\(x\\) and \\(y\\) saying 'there was piratical activity near Old in 1630s onwards' and 'Wallace settled Old in 1638' might give a Wallace theory, but if \\(y\\) is apocryphal, but \\(x\\) is not, historian might still sustain a similar theory on \\(x\\) alone)]
- add `draft` doc class option
- ~~fix dates~~[^too tricky]
- get pp into historical prose in `pp2`[^eg as sort of lit rev/historiographic discussion (maybe only build Map and modified Log as way of abstracting from vars theories/models in existing lit, to get at the general bare bones sans substantive interpretation)]
- copy edit/finish `pp2`
- submit `pp2` preprint[^on 2 apr 2025, pub 4 apr 2025]
- update preprint metadata stuff
- ask for comments[^fm res, gom, bul, bia, macp on 2 apr 2025 - got res comments on 3 apr 2025, got gom 'yes' on 3 apr 2025 - fm j burg (text) on 15 apr 2025 - got j burg comments (text) on 16 apr 2025]
- incorporate missing papers
- Gomez pp
- those wc might deny vagueness for some log/pir settlements
- for all `cod!=null` cartas, ensure `pir` and `def` values accurately reflect criteria as newly specified in `cartas/readme.md`
    - for `pir`, assume this is so[^ie `true` for 'has evidence of piracy, even in just a few words' was the rule followed iirc, with `pir=null` and `mil=true` for similar incidents not obviously piratical (eg foreign ships cruising/watering/wintering/being sighted, foreign sailors/pirates landing/trading, etc) or else `pir` value just ignored on filling in `mil` value]
    - ~~for `def` where `cod=false`, check carta for non-incident defence~~[^seems like a lot for little return - prolly only might get new coding from non-digitised cartas, as any mention of pir/def in catalogue data usually prompted `cod=true` iirc][^ie non-incident defence (eg general state of defence, defence proposals, etc) evidenced in *more* than a few words - incident defence already checked as `true` for 'has evidence of *particular* incident (rather than just general defence/proposals), even in few words' was the rule followed to now iirc, with `def=null` and `mil=true` for tricky incidents or else `pir` value just ignored on filling in `mil` value][^3026 hits]
    - for `def` where `cod=true`,
        - if `mil=true` and `def=true`, do nothing
        - if `mil=true` and `def!=true`, make `def=null`[^86 hits per `jq -r '.[] | select(.work.cod==true and .data.mil==true and .data.def!=true) .id' cartas-merged.json` then value swapped with `for i in $(cat hits.txt); do jq '.data.def=null' $i.json > tmp && mv tmp $i.json; done`]
        - if `mil=false`, check carta for non-incident defence[^1056 hits - checked only by viewing `cat` and `data` with few exceptions]
            - ~~get regex string~~ get `subl` commands[^list filenames of all hits to open via `subl foo.json fah.json` - can do `open foo.json fah.json` too]
            - check 1st batch
            - 2nd batch
            - 3rd batch
            - 4th batch
            - 5th batch[^excl genl def which was too broad (eg Barlovento Armada/Union de Armas stuff)]
            - 6th batch
            - 7th batch
            - 8th batch
            - 9th batch
            - 10th batch
            - 11th batch
            - 12th batch
        - if `mil=null`, ~~fix this mistake~~ check carta for both new `pir` and `def`[^~~as should have no `cod=true` and `mil=null` cartas~~ - got 35 hits, mostly cartas flagged for coding or deteriorated]
            - some finished
            - for remaining cartas
                - marked `cod=null` if it seemed record hadn't been skimmed/only cat data had been read
                - left `cod=true` with `mil=null` and added `stat=1` for the rest[^2 cartas in Mex post-dating Bacl withdrawal from Bay (as prolly don't need these), and 5 damaged cartas]
- ~~check these cartas against PARES's defence/piracy relations/authorities?~~[^would incl non-Hond def/pir so might be too much]
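The `def=null` swap in the checks above can be sketched in Python (hypothetical helper; assumes the hit files were already pre-selected by the jq query, as in the log):

```python
import json
from pathlib import Path

def null_out_def(hit_files: list[Path]) -> int:
    """Set .data.def to null in each pre-selected hit file, like the
    `jq '.data.def=null'` swap loop over hits.txt."""
    n = 0
    for p in hit_files:
        obj = json.loads(p.read_text())
        obj["data"]["def"] = None
        p.write_text(json.dumps(obj, indent=2))
        n += 1
    return n
```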
- for all years coded in previous log items, ensure no carta is `cod=null`
    - for 1630-1679 dated cartas, exactly 1630-1668 dated cartas mentioned in prev items
        - no `cod=null` in 1630s dated ones
        - none in 1640s
        - none in 1650s
        - none to 1668 inclusive
    - for other cartas, exactly 1601[^in Guat series only] and 1606-1607[^in Guat series only] and 1608-1629 dated cartas mentioned in prev items, but these do have vars `cod=null` cartas
- code missg cartas with `mil` expanded as above
    - for 1601[^find and open via `jq -j '.[] | select(.cat.lbd!=null) | select(.cat.lbd|contains("1601")) | select(.work.cod==null) | "\(.id).json "' temp.json | sed 's/c/open c/'`]
    - for 1606-1607[^but leave `cod=null` for non-Yucn Mex cartas, as 18/19 `mil=true` Mex cartas have 'Yucatán' string in `scope`, and 1/19 has 'Campeche' instead - so added filter `select(.cat.scope!=null) | select(.cat.scope|contains("Yucat") or contains("Campec"))`]
    - for 1608-1629[^only select Mex cartas as above]
- code 1602-1605 cartas with `mil` as above
    - all in Guat
    - select in Mex[^as above]
- code 1669 cartas as above
    - all in Guat
    - select in Mex
- ~~code 1670s cartas as above~~
    - ~~all in Guat~~
    - ~~select in Mex~~[^prolly not needed after Bacl withdrawal]
- so 1600s to 1660s fully coded[^with 135 `mil=true`]
- restructure `json` files[^for easier data entry]
    - set all `tally` to null,[^add by machine, not by hand]
    - add `incidents` key with empty array value,
    - remove `incident0` objects with all-null values,
    - add `no` key inside incidents objects with matching integer
    - move incidents key-value pairs as just objects to `incidents` array
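The restructuring steps above can be sketched in Python (a sketch under the assumption that each year object holds `tally` plus `incident0`, `incident1`, … sub-objects, as the log suggests; names otherwise invented):

```python
def restructure(year_obj: dict) -> dict:
    """Null the tally and fold incidentN objects (dropping all-null ones)
    into an `incidents` array whose members carry a matching `no` key."""
    out = {k: v for k, v in year_obj.items() if not k.startswith("incident")}
    out["tally"] = None
    incidents = []
    keys = sorted((k for k in year_obj if k.startswith("incident")),
                  key=lambda k: int(k[len("incident"):]))
    for key in keys:
        obj = year_obj[key]
        if obj is not None and any(v is not None for v in obj.values()):
            incidents.append({**obj, "no": int(key[len("incident"):])})
    out["incidents"] = incidents
    return out
```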
- get a good incl/excl test for def incidents, or just incl them all with `pir=false`[^keep `incidents/readme.md` conf test (looking for unique, well-sourced, non-compound piracy or defence-against-piracy incidents in Hond Bay in relevant 17th cent year), so for sure exclude defence against non-pirates, excl state of def/def proposals/def orders/genl def stuff, and possibly excl routine/minor def acts (receiving/restocking arms, receiving men)]
    - keep strict `conf` test from `incidents/readme.md`
    - add `possible` array key to all incidents to store acts which may be later discounted
- screen incidents against
    - events to ensure all events cited and all incidents extracted
        - all cited
            - ~~main hits~~[^88 `cod!=false` hits]
            - duplicates - set `cod=null`[^4 `cod=false` and `cod_n!=null` hits]
            - main hits - all cited[^92 `cod!=false` hits w duplicates now]
        - all needed ~~info~~ acts extracted to incidents[^~~seems not needed if event is cited~~ possibly some previously excluded acts could go in `possible` array - all info from events not needed though]
            - sort some existing acts into `possible`[^eg arrival of reinforcements]
            - for 1600s[^and added duplicate event ids]
            - for 1610s
            - for 1620s
            - for 1630s
            - for 1640s
            - for 1650s
            - for 1660s
            - for 1670s
            - for 1680s
            - for 1690s
            - for 1700
    - ~~published lit cited in events to get cleaner incidents~~[^rely on text in events files, or do later]
    - cartas to ensure all incidents extracted and to get cleaner incidents[^did not input vars `possible` acts eg sending arms]
- for 1601
- for 1602
- 1603
- 1604
- 1605
- 1606
- 1607
- 1608
- 1609
- 1610
- 1611
- 1612
- 1613-1615
- 1616
- 1617
- 1618
- 1619
- 1620s
- 1630s
- 1640s
- 1650s
- 1660s
- the rest[^only had some 1680s cartas]
- ~~make sure acts in `incidents` are exactly ones to tally~~[^do after draft]
- comb through incidents again to check:
    - all def acts are `pir=false`
    - no def act has embedded pir acts[^but pir ones allowed to have embedded def ones]
    - pir acts are bundled consistently
        - geographically[^Bay Isl vs Rattan, Amatique vs G Dulce, Omoa vs Pto Caballos, etc]
        - by type[^cruise during raid vs sans raid, etc]
- assign ids[^keep `.v-1.4_id`, keep *lower* `.v-1.4_id` in case record has > 1 of same odd/even type, plus assign odd ids > 583 for deposited series or collections, and even > 1628 else - assigned 7 odds s585 to s597, and 342 evens s1630 to s2312]
- get all records into `/citations`[^ought to have 1567 records (460 harv, 1107 v-1.4), but got 1456 (349 harv, 1107 v-1.4) - so 111 harv duplicates overwritten by their v-1.4 sources (as v-1.4 extracted after harv)]
- ~~get urls, old short cites into single arrays~~
- add `.versions` array for versions
- ~~delete null values~~
- ~~fix cite values for `v-1.4`~~[^all to `.old_cites`]
- ~~delete/merge duplicates~~[^currently 109 harv—v1.4 duplicates in `citations.json` (with only their v-1.4 copies in `/citations`) and 1 triplicate in `citations.json` (`s526`)]
    - recover `citations/s526.json` triplicate[^had 2 harv copies, 1 v-1.4 copy in `citations.json` - copied missing copies to `citations/s526.json` in invalid structure]
    - copy missing harv records for 109 duplicates
        - all harv and v-1.4 copies from `citations.json` now in `/citations`
    - ~~fix `s526` and all duplicates into single object~~[^do later]
- ~~merge separate items which are versions of one another~~[^do later]
- write readme
- remove `citations.json`
- get list of all currently used short cites, and fix corresponding files in `/citations`
    - 1 in `/cartas`
    - ca 200 in `/events`
        - ca 20 in notes[^assuming preceded by `/`, plus ignoring vars missg ones]
        - ca 180 in other fields
    - ~~vars in `/citations`~~[^ignore for now]
    - none in `/incidents`[^iirc]
    - ca 200 short cites need checking in `/citations`[^stored in untracked `scrap.txt`]
- add source files to readmes
    - in `/events`
    - in `/cartas`
    - in `/citations` - added
    - ~~in `/incidents`~~[^not imported]
- check file lengths[^1 = single object]
    - in `/events` - all of 1
    - in `/cartas` - all of 1
    - in `/citations` - 1 of 3, 109 of 2, rest of 1
    - in `/incidents` - all of 1
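The length check can be sketched in Python (counting concatenated top-level JSON values per file, which is what 'file length' tracks here; function name invented):

```python
import json

def object_count(text: str) -> int:
    """Count concatenated top-level JSON values in a file's text
    (1 = a single object; duplicate-holding files showed 2 or 3)."""
    decoder = json.JSONDecoder()
    text = text.strip()
    idx, n = 0, 0
    while idx < len(text):
        _, idx = decoder.raw_decode(text, idx)
        n += 1
        while idx < len(text) and text[idx] in " \t\r\n":
            idx += 1
    return n
```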
- figure out prior log of lit/discordance review[^not incl notes/citations in corresponding chronicle data ie incl only dedicated/explicit/structured lit/discordance review]
    - in `/pre-git-hist` - vars[^per `events.txt`, looks like first appears in `files/wiki/Draft-English_settling*.xml` written 26 Apr 2022, next (in spirit only) in `blue/analysis or state of historiography.rtf` fm 2 Jul 2022, next (more copy than data really) in `purp/disc amb 1.1*` fm 7 Jul 2022, then in `purp/1638 series 1.2 Discordance*.tsv` fm 9 Jul 2022, then in `purp/1638 series 1.5 Discordance*.tsv` fm 18 Jul 2022, then in `purp/1638 series 1.5 safe copy Discordance*.tsv` saved 20 Jul 2022, plus prolly `purp/1638 art 1.1.tex` fm 21 Jul 2022, and deposited `harv/1638 series*.tsv` on 3 Aug 2022]
    - in `/historical-series-re-Belize`
        - in `/work/*-22` - none
        - in `/work/*-23` - none
        - in `/work/*-24` - none
- write prior log, and extract latest version to `json`
    - added prior log
    - extract latest version from files[^140 from 45 sources from `harv/1638 series*.tsv` - `d*` quote ids ignored in favour of corresponding `s*` source ones]
- write `/literature/readme.md`
- replace `.att` values with boolean ones
    - null - 67 swapped
    - `0` and `1` and `2` - 15 and 0 and 0 swapped
    - `0n` and `0p` - 1 and 0 swapped
    - `1n` and `1p` - 1 and 17 swapped
    - `2n` and `2p` - 16 and 23 swapped
- import into `/literature`
- cross check vs wiki for missg records
- check for all 23 quotes in wiki table[^vars present in `/citations` but missg `upd`/`lbd`/`desc` values from wiki]
    - missing no 3 Roys s396
    - missg no 17 G Diaz s2270
    - missg no 18 C Quijano s418
- add missg quotes above to `/citations`[^w no `via` values]
    - files added
- check wiki xml to see if tag is `wiki`
    - none present
- check for all sources cited in wiki table[^ignored most instances of sources already in `/citations`]
    - 13 missing from `/citations`
        - add 3 sources to existing files[^missing actual quotes]
        - add 10 sources to new files[^missg actual quotes - plus added `citations/s2314.json` for new non-deposited source (Parsons) with `via:null`, and flagged duplicate in `note`]
- check `harv` and live wiki references to get missg sources not yet cited in live wiki nor `/literature`
    - found 6 in wiki - check `/citations` files are there and update[^plus added `s2316.json` for non-deposited `via:null`, and flagged duplicate in `note`]
    - found 7 in `harv`[^added to `scrap.txt`]
- check other places for missing `/literature` files[^added to `scrap.txt`]
    - found 2 in prior rdgs extracted to Notes/iA Writer[^listed in `historical-series-re-Belize/work/dec-22/` - note, vars events (and therefore incidents) missg from sources linked here]
    - found 0 in citations to B-Thomas, Restall aa
    - found 1 from prior recollection
    - found 0 citations in `e341`, 5 in `e399`
- update `/literature` files from above gathered `in` values
    - for Campbell aa
    - for G Diaz
    - for C Quijano[^`in` values only]
    - for V Barke aa[^`in` values only]
- add missg `/literature` files from above
    - for Davidson[^no quotes added as claims re only Bay Isl]
    - for Naylor[^no quotes added as claims re only Mosq Shr]
    - for Ibarra[^missg actual quote]
    - for all others[^no `quote` values, and no `in` values for some]
- update `/citations` files as needed per above[^plus added `citations/s2318.json` with `via:null`]
- fill in missg `/literature` values[^incl quotes on Tortuga/Mosq Shr/Bay Isl/Camp/E Yucn as lots of discordance in lit re these, and likely to need later on]
    - for original claims (as per live wiki)
        - incl for non-Hon stories, and incl non-quote values
    - for original claims (per recollection)
        - in Campbell aa
        - in Campbell bb[^added as `citations/s2320` with `via:null`]
        - in Restall
        - ~~plus missed Long (plus Restall rejection)~~
    - for others[^incl eg Long and any missed eg in `work/dec-22/`]
        - added vars already in `/citations`, plus added some not in `/citations` yet[^ie added `s2322` with `via:null`]
- cross check entered values above against source material to make sure they're accurate[^plus ensure thoroughness, and tag lit review/historiographic discussions of Bz origins, and order quotes for easier reading]
    - ~~for AMN~~[^is not print]
    - Aliphat
    - Amandala
    - Ancona
    - Asturias
    - ~~Asturias aa~~[^did not have copy]
    - Avery
    - Azpiroz
    - B-Thomas
    - BL EG
    - Bancroft
    - Bolland C
    - Bolland F
    - Bolland N
    - Breuer
    - Bridges
    - Britannica
    - Brockhaus
    - ~~Burdon~~[^had no copy]
    - `cite:C*` sources - 10 files[^added `s2324` to `/literature` and `/citations` with `via:null`]
    - `cite:D*` sources - 3 files
- sources possy with lit rev
    - G Palaez
    - ~~Molina~~[^had no copy]
    - Peniche
    - Restall aa
    - Ribot
- vars early print sources[^earliest *implicit* notice of problem re Bz origins seems like Hon Alm for 1826 or 1829, and earliest *explicit* possy McCulloch or Fenix, and earliest lit rev seems like N Ortega]
    - Dampier
    - Sloane
    - ~~Reads~~[^no copy]
    - Long
    - Robertson
    - Henderson
    - Coxe
    - Hon Alm
    - Bridges
    - Temple aa
    - Brockhaus
    - Levy
    - N Ortega
- more possy lit rev sources
    - ~~Ancona~~[^done]
    - Gibbs
    - ~~Hon Hnd~~[^no 1925 copy]
    - ~~Pavia~~
    - Runio B
    - ~~Molina~~[^no copy]
    - Donohoe
    - Winzerling
    - Curry
    - Krushus[^added `s2326` to `/literature` and `/citations` with `via:null`]
- tag literature review/historiographic sources in `note`
- add missing sources eg Caiger (s1334), Dobson, Ayearst, so on
- in `v-1.4`, first written by 7 sep 2021, last updated ~~24~~ 29 oct 2021 still in belmopan, cy[^varids 75 and 11 in `work/versioning/sheets varids.tsv`]
    - then worked on 9 jul 2022 to 3 aug 2022 in boston, ma[^in `purp/1638 series *.tsv` and `purp/PARES-*.txt` and `purp/1638 cartas *.tsv` and `yell/data manuscripts *.tsv`.]
    - no work since then
- in `v-1.4`, first written by 14 feb 2021, last updated 31 oct 2021 still in belmopan, cy[^varids 212 and 5 and 4, not counting 1638 settlement incident noted sans citation in earlier files]
    - then worked on 18 apr 2022 to 3 aug 2022 in boston, ma[^discounting (on a hunch) `red/prelim rep bz 1.4 WORK *.tsv` and counting `wiki/Timeline_of_piracy_in_the_Bay_of_Honduras.xml` (think it has original records), through prev listed files up to `harv/1638*.tsv` - but relevant events in wiki article worked up to 10 oct 2022]
    - no work since then[^missg vars from `historical-series-re-Belize/work/dec-22/` as of 19 feb 2025 1.24 pm]
- in `v-1.4`, none
    - then worked 26 apr 2022 to 3 aug 2022 in boston, ma[^in `files/wiki/Draft-English_settling*.xml` through prev listed files up to `harv/1638 series*.tsv`, but wiki article worked up to 14 nov 2022]

written 23 dec 2024 in benque, cy