# [draft] Local offline internet slices: Bringing pieces of the internet to our communities using WebArchives

Having your own 'internet archive' means keeping a local copy of the websites your community wants to have access to. It would be useful, for example, if the community network is locally fast but doesn't have access to the internet (or only a very limited one). This system could cache the websites, or someone could go to the city, fetch the websites for the rest, and then make this 'internet archive' accessible to the whole community.

Things to consider:

* what to fetch
* how to fetch
* how to access
* how to update

## fetching content

There are many ways to fetch content:

1. using the pywb recording capabilities
2. using the desktop app [Conifer](https://conifer.rhizome.org/)
3. using an automated mechanism
4. using proxies

### Advanced

https://github.com/internetarchive/warcprox

## browsing content

https://github.com/webrecorder/pywb

## updating content

## More info

Kiwix is an end-to-end solution to this exact problem: downloading webpages into .zim files, scrapers/crawlers, local viewers, web servers that share them, running on a Raspberry Pi, etc. See https://www.kiwix.org/en/ :)

Web archive package/bundle format: https://github.com/webrecorder/web-archive-collection-format

Potential funders:

* https://mellon.org/programs/public-knowledge/

https://webrecorder.net/embed-demo-2.html

https://webrecorder.net/embed-demo-1.html

https://gsuite.google.com/marketplace/app/replaywebpage/160798412227

https://pywb.readthedocs.io/en/latest/manual/usage.html#using-pywb-recorder
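To make the "fetch in the city, serve back home" idea concrete, here is a minimal stdlib-only sketch. This is *not* pywb or Kiwix (which handle real web archiving, WARC/ZIM formats, rewriting, etc.) — just an illustration of the core workflow: map each URL to a stable path in a local cache directory, download pages while connectivity is available, then serve that directory on the community network. All names here (`url_to_path`, `fetch_all`, the `archive/` layout) are hypothetical.

```python
"""Sketch: build a tiny local 'internet slice' with only the stdlib.

Step 1 (in the city, online): fetch_all() downloads pages into archive/.
Step 2 (back home, offline):  serve archive/ with any static file server.
"""
import os
import urllib.parse
import urllib.request


def url_to_path(url: str, cache_dir: str = "archive") -> str:
    """Map a URL to a stable filesystem path inside the local cache.

    https://example.org/docs/intro -> archive/example.org/docs/intro/index.html
    https://example.org/style.css  -> archive/example.org/style.css
    """
    parts = urllib.parse.urlsplit(url)
    path = parts.path.strip("/")
    # Paths without a file extension are treated as "directories":
    # store them as <path>/index.html so a static server picks them up.
    if "." not in os.path.basename(path):
        path = os.path.join(path, "index.html")
    return os.path.join(cache_dir, parts.netloc, path)


def fetch_all(urls, cache_dir="archive"):
    """Download each URL into the cache (run this where there IS internet)."""
    for url in urls:
        dest = url_to_path(url, cache_dir)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
            out.write(resp.read())
```

Back on the community network, the cache can then be served with something as simple as `python -m http.server --directory archive 8000`. A real deployment would want link rewriting and faithful replay, which is exactly what pywb and Kiwix provide.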