== Discussions ==
==== Hotspot ====
*Discuss Hotspot permanent (across SD update) storage
*Discuss Hotspot maintenance policy: when/under which circumstances do we stop maintaining a version/deployment/hardware? What's our limit on client support (browser version, for instance)?
*What is/are the perfect hotspot hardware configuration(s)?
*Should we better serve companies (vs NGOs / Foundations)?
*Should we better serve preppers (what is the outcome of the custom prepper image offering)?
*Should we sell all-in-one hotspot configurations? What would that include: hardware + ZIMs? a virtual machine? continuous updates?
==== Catalog management ====
*Discuss relevance of the per-scraper Tag in ZIM metadata
*Have the Content Team present the Tag/Category strategy (including the i18n issue)
*Discuss availability of content: once we've started to provide a piece of content, do we consider that we have to do our best to continue providing it and updating it on a regular basis?
==== Scraping ====
*Merge zimit / warc2zim? Merge youtube / ted? Merge all Python scrapers?
*How tolerant are we with item failures in scrapers? In other words, is it preferable to produce a ZIM with only 99% of the source content because the scraper fails to process 1% of it, or do we want to target 100% or nothing? mwoffliner is very strict: one failing article and the scraper stops. zimit is very permissive: there is no limit on the number of failed pages. iFixit takes a middle ground, allowing the percentage of failed items to be configured.
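The middle-ground policy described above could be sketched roughly as follows. This is a hypothetical illustration, not code from any of the scrapers mentioned; the names `scrape_items`, `TooManyFailures`, and `max_failed_pct` are made up for this example.

```python
# Hypothetical sketch of a configurable failure-tolerance policy:
# keep going past individual item failures, but abort the whole run
# once failures exceed a configurable percentage of all items.

class TooManyFailures(Exception):
    """Raised when the failure rate exceeds the configured threshold."""


def scrape_items(items, process, max_failed_pct=1.0):
    """Process every item; abort if more than max_failed_pct % fail."""
    failed = 0
    results = []
    for item in items:
        try:
            results.append(process(item))
        except Exception:
            failed += 1
            if failed / len(items) * 100 > max_failed_pct:
                raise TooManyFailures(
                    f"{failed}/{len(items)} items failed "
                    f"(limit: {max_failed_pct}%)"
                )
    return results
```

Under this sketch, `max_failed_pct=0` mimics the strict mwoffliner behavior (any failure aborts), while `max_failed_pct=100` mimics the permissive zimit behavior (failures never abort the run).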
==== Cross-projects ====
*Discuss Support Policy: document existing unwritten or loosely written policies (Apple, etc.) and define/document one for public services (library.kiwix.org)
==== Meta/org ====
*Presentation of the latest board slideware and outcomes (with confidential information redacted, if any)
*Retrospective of last year
*Roadmap for 2024, the next 2 years, and 5 years; post-mortem
== Achievements ==