A

About half a dozen dependencies have patches that have yet to be merged upstream.

Other
Client wants to use Intertwingler as a back-end.
Get in touch with the maintainers of the respective packages and get them to merge the patches.
[DONE] Ship Intertwingler as a Docker image.

Adding a cache field to Store::Digest will require changing the metadata record, which in turn will create an incompatibility.

Has Broader
Changing the metadata layout in Store::Digest will make it incompatible with already-initialized stores.
Other
Create a variant of the metadata driver for both the old and the new version of the database layout.
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.

Alias privatealpha.com to methodandstructure.com.

Has Broader
Put methodandstructure.com into Intertwingler.

Arrange for each site under management to have its own RDF store.

Other
A separate RDF store per site will make a big mess with Intertwingler's configuration data.
Intertwingler is currently sharing the same RDF store for all sites under management.
Separate RDF stores are the second most reliable way to keep data from leaking across sites under management, second to separate Intertwingler instances altogether.
Separate stores will consume considerably more RAM and drive space.
[RESOLVED] As it stands, every site managed by Intertwingler can see the data from every other site.

Associate private cache entities with the principal that requested them.

Other
How do we handle private cache entries?
What about private cache entities that also happen to match public responses?

Associate the cache entity with a null principal to indicate that it is actually public.

Other
What about private cache entities that also happen to match public responses?
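The null-principal idea can be sketched with a toy cache (all names here are hypothetical, not Intertwingler's actual API): a private entry is keyed by the principal that requested it, a public entry is keyed by `nil`, and lookup falls through from the principal-specific key to the public one. An entry that matches a public response is thus stored once, under the null principal.

```ruby
# Hypothetical sketch: cache entries keyed by [uri, principal].
# A nil principal marks the entry as public.
class ResponseCache
  def initialize
    @entries = {}
  end

  # Store a representation; principal: nil means public.
  def put(uri, body, principal: nil)
    @entries[[uri, principal]] = body
  end

  # Try the principal's private entry first, then fall back to
  # the public (null-principal) entry.
  def get(uri, principal: nil)
    @entries[[uri, principal]] || @entries[[uri, nil]]
  end
end
```

One design consequence: a principal with no private entry for a resource transparently gets the public one, which is exactly the "also happens to match public responses" case.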

ATProto speaks XRPC (over WebSockets) while Intertwingler is very heavily REST.

Other
Create an adapter that will map between ordinary HTTP resources and WebSockets.
Make Intertwingler an ATProto PDS.

Attach "next" and "previous" relations to the window resources, when applicable.

Other
How does a client determine how to fill out the entire collection?

The address space is already occupied by the old statically-generated sites and the old App::IBIS prototype.

Other
Get Intertwingler the hell online.
Migrate App::IBIS content to Intertwingler/Sense Atlas.

[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.

Has Broader
[DONE] Create a set of "catalogue resources".
Has Narrower
Paginate the result set.
[ADOPTED] Treat the `instance-of`, `in-range-of`, `in-domain-of` parameters as a union, that is, select all resources that match any of them.
[DONE] Add an `instance-of` parameter that takes class identifiers for parity with the original resource.
[DONE] Create an `asserted` flag that is on by default but can be set to false to eliminate asserted resources from the result set.
[DONE] Create an `inferred` flag that is on by default but can be set to false that will disable inferencing.
[DONE] Make the inventory resource respond to an `in-domain-of` parameter for symmetry with `in-range-of`.
[DONE] Perform RDFS/OWL inferencing against the supplied parameters and result set.
[DONE] Put the `in-range-of` property into a parameter.
Other
An inventory that takes all these parameters does much more than just display resources in the range of a given property.
How are the parameters reconciled if you have multiples at once?
How do you make sure that you net resources that are subclasses of, or related by subproperties to, the supplied parameters?
How is the inventory going to be related to the index?
To populate potential linkages, the tool needs a list of all resources that are in the range of a given property.
What if the result set is enormous?

[ADOPTED] Treat the `instance-of`, `in-range-of`, `in-domain-of` parameters as a union, that is, select all resources that match any of them.

Has Broader
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Other
How are the parameters reconciled if you have multiples at once?

B

Bust out the profiler and fix whatever's causing RDF::Graph to run (INSANELY) slow.

Other
RDF::Graph slows down operations by two orders of magnitude!
We have a good relationship with the Ruby-RDF maintainer, who very promptly merges and pushes out our patches.

C

Can the resource identified by the hash URI be the cache entry?

Other
Create an ontology based on PROV-O and the HTTP message ontology that can represent the appropriate metadata.
It is possible for two different requests to produce the same cache entity.

Change the LMDB driver on Store::Digest so the main entry table uses integer keys rather than the much larger "primary" hash.

Has Broader
Overhaul Store::Digest.
Has Narrower
Create a variant of the metadata driver for both the old and the new version of the database layout.
Fix the C code in the Ruby LMDB bindings so that read-only transactions can "nest" (i.e., just no-op and use the parent transaction).
Other
Changing the metadata layout in Store::Digest will make it incompatible with already-initialized stores.
It turns out that there is a bug in the LMDB bindings that keeps nested transactions from working properly.
Store::Digest uses a baroque and inefficient layout for its metadata.

Changing the metadata layout in Store::Digest will make it incompatible with already-initialized stores.

Has Narrower
Adding a cache field to Store::Digest will require changing the metadata record, which in turn will create an incompatibility.
Other
Change the LMDB driver on Store::Digest so the main entry table uses integer keys rather than the much larger "primary" hash.
Create a mechanism that upgrades old metadata database layouts to the new one.

Client is not a Ruby developer.

Other
Client wants to use Intertwingler as a back-end.
[DONE] Ship Intertwingler as a Docker image.

Client wants to map IBIS issues to GitHub issues.

Has Broader
Client wants to use the IBIS tool (Sense Atlas) in everyday operations.
Other
Create an API bridge that will connect the two systems.

Client wants to use Intertwingler as a back-end.

Has Narrower
Client wants to use the IBIS tool (Sense Atlas) in everyday operations.
Other
About half a dozen dependencies have patches that have yet to be merged upstream.
Client is not a Ruby developer.
[DONE] Ship an Intertwingler instance that the client can use.

Client wants to use the IBIS tool (Sense Atlas) in everyday operations.

Has Broader
Client wants to use Intertwingler as a back-end.
Has Narrower
Client wants to map IBIS issues to GitHub issues.
Get rudimentary multi-site support working on Intertwingler.
Plan projects and resources with the use of the Process Model Ontology.
Other
Ship a front end with interfaces for IBIS, PM, and a rudimentary FOAF/ORG editor.
[DONE] Port the IBIS tool to Intertwingler.

Create a background worker thread that polls regularly for expired cache entries.

Has Broader
Run the cache eviction process in another thread.
Other
How do we actually go about evicting expired cache entries?
If you have multiple processes accessing Store::Digest, this means multiple cleanup threads.
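A minimal sketch of the polling worker, assuming nothing about Store::Digest's real interface (class and method names below are illustrative): a single thread wakes on an interval and sweeps entries whose expiry time has passed. The multi-process drawback is visible in the design, since each process would spawn its own worker.

```ruby
# Hypothetical sketch of a background eviction worker.
class ExpiringStore
  def initialize
    @mutex   = Mutex.new
    @entries = {} # key => { body:, etime: }
  end

  def put(key, body, etime:)
    @mutex.synchronize { @entries[key] = { body: body, etime: etime } }
  end

  def get(key)
    @mutex.synchronize { @entries[key]&.fetch(:body) }
  end

  # One sweep: remove anything whose expiry time has passed.
  def evict_expired(now: Time.now)
    @mutex.synchronize { @entries.reject! { |_, e| e[:etime] <= now } }
  end

  # The polling thread; with multiple processes sharing a store
  # there would be one of these per process.
  def start_eviction_worker(interval: 60)
    Thread.new do
      loop do
        sleep interval
        evict_expired
      end
    end
  end
end
```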

Create a configurable "pecking order" of URI schemes so the precedence is unambiguous.

Has Broader
Overhaul the resolver so that it has pluggable transformations between durable identifiers and routable addresses.
Other
What happens if a resource has more than one durable identifier?

Create a mechanism that upgrades old metadata database layouts to the new one.

Has Broader
Overhaul Store::Digest.
Has Narrower
Detect what version the metadata layout is and whine at the user to upgrade it on their own time.
Other
Changing the metadata layout in Store::Digest will make it incompatible with already-initialized stores.
We don't really have time to write migration code, and we're not sure how many stores are out there in the wild.
We don't want to take people by surprise by forcing them into upgrading their data store.

Create a server-side XSLT transform.

Other
Google has, in fact, committed to eliminating support for XSLT.
We were relying on browser cache to keep XSLT from being intolerably slow.

Create a variant of the metadata driver for both the old and the new version of the database layout.

Has Broader
Change the LMDB driver on Store::Digest so the main entry table uses integer keys rather than the much larger "primary" hash.
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
Has Narrower
Detect what version the metadata layout is and whine at the user to upgrade it on their own time.
Refactor Store::Digest::Meta and Store::Digest::Blob so that they are classes instead of mixins and can be subclassed, and make the driver wrap them.
Other
Adding a cache field to Store::Digest will require changing the metadata record, which in turn will create an incompatibility.
For some reason I made Store::Digest out of mixins, which can't do inheritance.

Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.

Has Broader
[DONE] Create a set of "catalogue resources".
Has Narrower
[DONE] Add an `instance-of` parameter that takes class identifiers for parity with the original resource.
[DONE] Make the inventory resource respond to an `in-domain-of` parameter for symmetry with `in-range-of`.
[DONE] Put the `in-range-of` property into a parameter.
Other
An inventory that takes all these parameters does much more than just display resources in the range of a given property.
There are all manner of properties, so hard-coding a single one is silly.
To populate potential linkages, the tool needs a list of all resources that are in the range of a given property.

Create an adapter that will map between ordinary HTTP resources and WebSockets.

Other
A REST-to-WebSockets adapter will make it easy to write handlers (microservices).
ATProto speaks XRPC (over WebSockets) while Intertwingler is very heavily REST.

Create an API bridge that will connect the two systems.

Other
Client wants to map IBIS issues to GitHub issues.
How do you safely store API credentials?
What about changes that happen on GitHub?

Create an ontology based on PROV-O and the HTTP message ontology that can represent the appropriate metadata.

Has Broader
Put the cache metadata into the RDF store.
Other
Can the resource identified by the hash URI be the cache entry?
How are we going to represent the metadata for the cache entries?

Create an `etime` index for cache expiry; only put actual tombstones in the `dtime` index, and only put cache entries in `etime`.

Has Broader
Reuse the `dtime` field and just set it to the future.
Other
Mixing cache and non-cache entities in the `dtime` index will make it necessary to parse and scan each entry when it comes time to expire cache entries.
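The benefit of a dedicated `etime` index can be sketched with a sorted array standing in for an LMDB integer-keyed database (the class below is a hypothetical stand-in, not Store::Digest's actual driver): because only cache entries live in the index, expiry is a bounded scan off the front, with no parsing of tombstone (`dtime`) records.

```ruby
# Hypothetical sketch of a separate `etime` index: expiry times
# map to entry keys, kept sorted (as an LMDB integer-keyed
# database would be), so expiry is a bounded range scan.
class EtimeIndex
  def initialize
    @index = [] # [[etime, key], ...] kept sorted by etime
  end

  def add(etime, key)
    at = @index.bsearch_index { |(t, _)| t >= etime } || @index.size
    @index.insert(at, [etime, key])
  end

  # Everything expiring at or before `now` sits at the front of
  # the index; non-cache (dtime) entries never appear here.
  def expired(now)
    @index.take_while { |(t, _)| t <= now }.map(&:last)
  end

  def remove_expired(now)
    keys = expired(now)
    @index.shift(keys.size)
    keys
  end
end
```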

D

A dedicated cache metadata store will be much more efficient in many ways than trying to use RDF.

Other
Index the cache entries using a dedicated key-value store.

Dedicate some time to hunting down and squashing memory leaks and general wasteful memory usage.

Other
There are almost certainly memory leaks in and around Intertwingler and its snarl of dependencies.

Design a compact XSLT syntax and concomitant transpiler.

Other
The XSLT component kind of sucks.
XSLT (any version) would be a million times easier to work with if it was just easier to type.

Design a mechanism for offloading and/or paginating backlinks.

Has Narrower
Make an addressable transform of the form `;backlinks=property,window-start,window-end`, one for each property (that runs long).
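Parsing the `;backlinks=property,window-start,window-end` form can be sketched in a few lines (function names and the sample property below are illustrative, not Intertwingler's resolver API):

```ruby
# Hypothetical sketch: parse an addressable-transform path
# parameter of the form ;backlinks=property,start,end and apply
# it as a window over a list of backlinking subjects.
BACKLINKS = /;backlinks=([^,;]+),(\d+),(\d+)/

def parse_backlinks_param(path)
  m = BACKLINKS.match(path) or return nil
  { property: m[1], start: m[2].to_i, end: m[3].to_i }
end

def window_backlinks(subjects_by_property, path)
  p = parse_backlinks_param(path) or return nil
  subjects = subjects_by_property.fetch(p[:property], [])
  subjects[p[:start]..p[:end]]
end
```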

Despite being standard, reliable, and fast to execute, and built into every browser, XSLT 1.0 in particular is really clunky to work with.

Other
The XSLT component kind of sucks.
Try messing around with SaxonJS?

Detect what version the metadata layout is and whine at the user to upgrade it on their own time.

Has Broader
Create a mechanism that upgrades old metadata database layouts to the new one.
Create a variant of the metadata driver for both the old and the new version of the database layout.
Has Narrower
Once again, pass in an initialization parameter to explicitly upgrade the store.
Yes. Supply an initialization parameter to ignore the whining.
Other
How does the user assent to upgrading the metadata store?
Should the user be able to silence the whining?
We don't want to take people by surprise by forcing them into upgrading their data store.
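The detect-and-whine flow, with both proposed initialization parameters, can be sketched roughly as follows (the version numbers, key names, and flags are all hypothetical):

```ruby
# Hypothetical sketch of "detect the layout version and whine":
# read a version tag from the store's metadata and warn unless
# the caller explicitly assented to an upgrade or to silence.
CURRENT_LAYOUT = 2

def check_layout(meta, upgrade: false, quiet: false)
  version = meta.fetch(:layout_version, 1)
  return :ok       if version == CURRENT_LAYOUT
  return :upgraded if upgrade # the user assented explicitly
  warn "metadata layout v#{version} is outdated; " \
       'pass upgrade: true to migrate' unless quiet
  :outdated
end
```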

Do we actually want to keep this domain alive?

Other
Let's keep makethingsmakesense.com and just redirect it for now.
Put makethingsmakesense.com into Intertwingler.

Do we actually want/need really busy, chatty cache metadata in the RDF store?

Other
Put the cache metadata into the RDF store.

doriantaylor.com only needs a basic notion of "private" (only visible to me) and "public".

Other
Implement a rudimentary mechanism for partitioning the graph into "public" and "private".
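Since doriantaylor.com only needs "private" versus "public", the partition can be sketched as two named graphs and a visibility check (the class and the `:owner` principal are illustrative stand-ins, not the real store):

```ruby
# Hypothetical sketch: statements partitioned into two named
# graphs; queries always see :public, and see :private only for
# the site owner.
class PartitionedGraph
  def initialize
    @graphs = { public: [], private: [] }
  end

  def insert(statement, graph: :public)
    @graphs.fetch(graph) << statement
  end

  def statements(principal: nil)
    visible = [:public]
    visible << :private if principal == :owner
    visible.flat_map { |g| @graphs[g] }
  end
end
```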

[DONE] Add an `instance-of` parameter that takes class identifiers for parity with the original resource.

Has Broader
Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Other
The original inventory resource in the prototype goes by instance of class rather than range.

[DONE] Clean up slow RDF::Graph methods and push upstream.

Has Broader
Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.
Other
RDF::Graph slows down operations by two orders of magnitude!

[DONE] Create a Git repository with a prefabricated configuration.

Has Broader
[DONE] Ship an Intertwingler instance that the client can use.
Other
What if the client has trouble?

[DONE] Create a set of "catalogue resources".

Has Broader
[DONE] Port the IBIS tool to Intertwingler.
Has Narrower
Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
Represent windowed slices of a collection as ephemeral entities. Have them refer back to the main collection with its unmanageably many elements.
When somebody tries to access the main collection, redirect them to the first window onto the collection.
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
[DONE] Create an index as a top-level entry point to the catalogues.
[DONE] Define a relation that connects the tool to the entry point.
Other
How does the front-end know where the catalogues are?
How will we construct the UI of the RDF-KV forms?
What happens when you have an extreme quantity of entities for a given class and/or in the domain or range of a given predicate?
[RESOLVED] Intertwingler currently does not have the resources to furnish the IBIS tool with fodder for autocompletes.

[DONE] Create an "index" resource that enumerates all the windowed resources.

Other
How does a client determine how to fill out the entire collection?

[DONE] Create an index as a top-level entry point to the catalogues.

Has Broader
[DONE] Create a set of "catalogue resources".
Has Narrower
[DONE] Define a relation that connects the tool to the entry point.
Other
How does the front-end know where the catalogues are?
To populate potential linkages, the tool needs a list of all resources that are in the range of a given property.

[DONE] Create an RDF-KV handler.

Has Broader
[DONE] Ship an Intertwingler instance that the client can use.
Other
How will we construct the UI of the RDF-KV forms?
What about writing to the graph?

[DONE] Create an `asserted` flag that is on by default but can be set to false to eliminate asserted resources from the result set.

Has Broader
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Other
What if you wanted only inferred results with no asserted ones?

[DONE] Create an `inferred` flag that is on by default but can be set to false that will disable inferencing.

Has Broader
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Other
What if we only want resources that match assertions with no inferencing?
What if you wanted only inferred results with no asserted ones?

[DONE] Decouple the IBIS front-end from its back-end (into its own Git[hub] repository).

Has Broader
[DONE] Port the IBIS tool to Intertwingler.
Other
[RESOLVED] The front-end code to the IBIS tool is currently in the back-end repository.

[DONE] Define a relation that connects the tool to the entry point.

Has Broader
[DONE] Create a set of "catalogue resources".
[DONE] Create an index as a top-level entry point to the catalogues.
Other
How does the front-end know where the catalogues are?

[DONE] Generate an ARM64 Docker image.

Has Broader
[DONE] Ship Intertwingler as a Docker image.
Other
It turns out that the default (AMD64) build is unusably slow in the client's development environment (Darwin/ARM64).

[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.

Has Broader
Get Intertwingler the hell online.
Other
How do we handle authentication and access control?
I don't love having my dev server conscripted into production.
I have fibre and get a 1.7ms ping time to the server, even over ipsec.
It turns out the server at DigitalOcean is actually really slow.
The FastCGI daemon running Intertwingler keeps exiting; there is something not right with it.
We can shell out for proper infrastructure when this thing starts making some money.

[DONE] Make the Intertwingler home directory a Docker volume.

Has Broader
[DONE] Ship Intertwingler as a Docker image.
Other
How do we get instance data in and out?

[DONE] Make the inventory resource respond to an `in-domain-of` parameter for symmetry with `in-range-of`.

Has Broader
Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Other
Properties go in the other direction too, so it would be useful to be able to pick subjects of properties as well as objects.

[DONE] Perform RDFS/OWL inferencing against the supplied parameters and result set.

Has Broader
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Other
How do you make sure that you net resources that are subclasses of, or related by subproperties to, the supplied parameters?
What if we only want resources that match assertions with no inferencing?

[DONE] Port the IBIS tool to Intertwingler.

Has Broader
Put makethingsmakesense.com into Intertwingler.
[DONE] Put client extranets into Intertwingler.
Has Narrower
[DONE] Create a set of "catalogue resources".
[DONE] Decouple the IBIS front-end from its back-end (into its own Git[hub] repository).
Other
Client wants to use the IBIS tool (Sense Atlas) in everyday operations.
We would actually be able to represent real sequences of actions instead of just suggestions about what to do.
We would actually be able to unblock development on the IBIS tool that has been sitting idle for over a decade.
[RESOLVED] Intertwingler currently does not have the resources to furnish the IBIS tool with fodder for autocompletes.
[RESOLVED] The front-end code to the IBIS tool is currently in the back-end repository.

[DONE] Put client extranets into Intertwingler.

Has Broader
Put methodandstructure.com into Intertwingler.
Has Narrower
[DONE] Port the IBIS tool to Intertwingler.

[DONE] Put Sense Atlas on its own website, senseatlas.net.

Other
Intertwingler currently lacks any internal caching, so I don't trust it standing up to even the baseline thrash of the open internet.
Where should Sense Atlas go?

[DONE] Put Sense Atlas online so the client can observe its development.

Has Broader
Rough out the new classes and try loading them with instance data; take notes for what can be improved.
Has Narrower
Get Intertwingler the hell online.
Leave Forget Passwords on the DO server, and hack both it and Intertwingler, respectively, to transmit and receive a JWT.
[DONE] Upgrade DigitalOcean server to handle the additional load introduced by Intertwingler.
Other
How do we handle authentication and access control?
Where should Sense Atlas go?

[DONE] Put the `in-range-of` property into a parameter.

Has Broader
Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Other
Properties go in the other direction too, so it would be useful to be able to pick subjects of properties as well as objects.
The original inventory resource in the prototype goes by instance of class rather than range.
There are all manner of properties, so hard-coding a single one is silly.

[DONE] Ship an Intertwingler instance that the client can use.

Has Narrower
[DONE] Create a Git repository with a prefabricated configuration.
[DONE] Create an RDF-KV handler.
[DONE] Ship Intertwingler as a Docker image.
[DONE] Write a getting-started tutorial.
Other
Client wants to use Intertwingler as a back-end.
It turns out that the default (AMD64) build is unusably slow in the client's development environment (Darwin/ARM64).
What about writing to the graph?
What if the client has trouble?

[DONE] Ship Intertwingler as a Docker image.

Has Broader
[DONE] Ship an Intertwingler instance that the client can use.
Has Narrower
[DONE] Generate an ARM64 Docker image.
[DONE] Make the Intertwingler home directory a Docker volume.
Other
About half a dozen dependencies have patches that have yet to be merged upstream.
Client is not a Ruby developer.
How do we get instance data in and out?
It turns out that the default (AMD64) build is unusably slow in the client's development environment (Darwin/ARM64).
Shipping Intertwingler as a Docker image will afford putting off the open-source diplomacy of the upstream patches.

[DONE] Upgrade DigitalOcean server to handle the additional load introduced by Intertwingler.

Has Broader
Install Intertwingler on DigitalOcean VPS as a FastCGI daemon.
[DONE] Put Sense Atlas online so the client can observe its development.
Other
Intertwingler has been observed consuming up to a gigabyte of resident-set RAM during development.
It turns out the server at DigitalOcean is actually really slow.

[DONE] Write a getting-started tutorial.

Has Broader
[DONE] Ship an Intertwingler instance that the client can use.
Other
What if the client has trouble?

E

E-mail addresses are still going out in the clear.

Other
Get Intertwingler the hell online.
Implement e-mail scrambler transform.

Embedding Forget Passwords into Intertwingler is going to severely complicate (and compromise) the configuration for both systems, as well as the installation as a whole.

Other
Split Forget Passwords into app and middleware and wrap the latter around Intertwingler (on deuce).

Even ordinary page GETs when the graph state has not changed can be slow, particularly the first time a user loads a generated resource.

Other
Implement internal caching.
Sense Atlas is particularly slow after POSTs because the RDF store only has a single global modification time (that I had to hack in).

Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.

Has Broader
Implement caching of generated representations against Store::Digest.
Overhaul Store::Digest.
Has Narrower
Create a variant of the metadata driver for both the old and the new version of the database layout.
If an entity is inserted into the store over top of a cache entry, clear the cache flag and ensure it is never re-enabled.
Reuse the `dtime` field and just set it to the future.
Other
Adding a cache field to Store::Digest will require changing the metadata record, which in turn will create an incompatibility.
How do we actually go about evicting expired cache entries?
It would be useful to be able to indicate to the engine that a given handler manages a content-addressable store (and thus can be used for caching).
We want Intertwingler to have a unified interface to opaque blob storage, whether a given blob is intended to be persisted or not.
What do we do about the expiration time of the cache entries?
What if a cache entry is superseded by an entity that is durable?

F

File a bug upstream to get proper support for FastCGI in Rackup.

Has Broader
Fish the FastCGI driver out of old Rack code and install it under /var/lib/intertwingler.
Other
The FastCGI daemon running Intertwingler keeps exiting; there is something not right with it.
There is still value in getting this working for Forget Passwords, which is stuck on Rack 2.x.

Find out if SHACL also uses grapheme clusters.

Has Broader
Try to create a mapping function between Lexicon and RDF.
Other
Lexicon appears to use Unicode grapheme clusters as the unit for text length.

Finish implementing addressable transforms.

Has Narrower
Implement internal caching.
Put every addressable transform in the queue and have the path parameters filter out and rearrange the remnants.
Other
Ordinary transforms are already attached to their respective queues. How do we handle transforms for addressable queues?
There are a number of scaled and transformed images on the site.
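The "put every addressable transform in the queue and have the path parameters filter and rearrange the remnants" approach can be sketched like this (the transform names and path syntax below are illustrative, loosely modelled on the image-scaling case):

```ruby
# Hypothetical sketch: the full queue of addressable transforms
# is the default; parameters found in the request path select
# the subset that actually runs, in the order they appear.
ALL_TRANSFORMS = %w[crop scale desaturate].freeze

def requested_transforms(path)
  # Path parameters look like /img.png;scale=320,240;crop=...
  params = path.scan(/;([a-z]+)=/).flatten
  params.select { |p| ALL_TRANSFORMS.include?(p) }
end
```

Unrecognized parameters simply fall out of the queue, and reordering the parameters reorders the transforms.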

Fish the FastCGI driver out of old Rack code and install it under /var/lib/intertwingler.

Has Broader
Install Intertwingler on DigitalOcean VPS as a FastCGI daemon.
Has Narrower
File a bug upstream to get proper support for FastCGI in Rackup.
Other
The FastCGI daemon running Intertwingler keeps exiting; there is something not right with it.
The Rack maintainers have eliminated the FastCGI driver from Rack ≥3.0 (and thus Rackup) for being "antiquated".

Fix the C code in the Ruby LMDB bindings so that read-only transactions can "nest" (i.e., just no-op and use the parent transaction).

Has Broader
Change the LMDB driver on Store::Digest so the main entry table uses integer keys rather than the much larger "primary" hash.
Other
It turns out that there is a bug in the LMDB bindings that keeps nested transactions from working properly.

For now, just configure Intertwingler to power this one client's extranet.

Other
The way the extranets are configured, it would be a big pain to carve out an exception for just one of them.
While Intertwingler has been designed to drive multiple websites (authorities), it has only been tried with a single site, and there are invariably kinks in the implementation.

For some reason I made Store::Digest out of mixins, which can't do inheritance.

Other
Create a variant of the metadata driver for both the old and the new version of the database layout.
Refactor Store::Digest::Meta and Store::Digest::Blob so that they are classes instead of mixins and can be subclassed, and make the driver wrap them.

The FastCGI daemon running Intertwingler keeps exiting; there is something not right with it.

Other
File a bug upstream to get proper support for FastCGI in Rackup.
Fish the FastCGI driver out of old Rack code and install it under /var/lib/intertwingler.
[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.

G

Get in touch with the maintainers of the respective packages and get them to merge the patches.

Other
About half a dozen dependencies have patches that have yet to be merged upstream.

Get Intertwingler installable via `gem install` with no snags.

Get Intertwingler off deuce as soon as financially possible.

Other
I don't love having my dev server conscripted into production.
We can shell out for proper infrastructure when this thing starts making some money.

Get Intertwingler the hell online.

Has Broader
Ship a front end with interfaces for IBIS, PM, and a rudimentary FOAF/ORG editor.
[DONE] Put Sense Atlas online so the client can observe its development.
Has Narrower
Implement e-mail scrambler transform.
Install Intertwingler on DigitalOcean VPS as a FastCGI daemon.
Migrate App::IBIS content to Intertwingler/Sense Atlas.
Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.
[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.
Other
E-mail addresses are still going out in the clear.
Get rudimentary multi-site support working on Intertwingler.
Intertwingler currently lacks any internal caching, so I don't trust it standing up to even the baseline thrash of the open internet.
The address space is already occupied by the old statically-generated sites and the old App::IBIS prototype.

Get rudimentary multi-site support working on Intertwingler.

Has Broader
Client wants to use the IBIS tool (Sense Atlas) in everyday operations.
Other
Get Intertwingler the hell online.
Ship a front end with interfaces for IBIS, PM, and a rudimentary FOAF/ORG editor.
[RESOLVED] As it stands, every site managed by Intertwingler can see the data from every other site.

Get Sense Atlas/Intertwingler running the client extranets, which are shielded from the thrash of the open internet.

Other
Intertwingler currently lacks any internal caching, so I don't trust it standing up to even the baseline thrash of the open internet.
Intertwingler is currently sharing the same RDF store for all sites under management.
Intertwingler needs proper functioning multi-site support anyway.
Where should Sense Atlas go?
While Intertwingler has been designed to drive multiple websites (authorities), it has only been tried with a single site, and there are invariably kinks in the implementation.

Google has, in fact, committed to eliminating support for XSLT.

Other
Create a server-side XSLT transform.

Granular modification times would be a byproduct of a full transaction history.

Other
Make it possible to query the RDF store for the modification times of specific resources and/or statements.

H

Have the resolver wrap its repository in an RDF::Graph.

Other
Having to pass the graph name into every query is going to be tedious and error-prone.
RDF::Graph slows down operations by two orders of magnitude!
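The wrapping idea can be sketched with a stdlib stand-in (class names below are hypothetical; in ruby-rdf the real wrapper is `RDF::Graph`, with the noted performance caveat): a view object pins the graph name once, so callers never pass it.

```ruby
# Stdlib stand-in for wrapping a quad store in a graph-scoped
# view, so every query is implicitly narrowed to one named graph.
class QuadStore
  def initialize
    @quads = [] # [subject, predicate, object, graph_name]
  end

  def insert(s, p, o, g)
    @quads << [s, p, o, g]
  end

  def query(graph: nil)
    graph ? @quads.select { |q| q[3] == graph } : @quads
  end
end

class GraphView
  def initialize(store, graph_name)
    @store, @graph_name = store, graph_name
  end

  # No graph name to forget: the view supplies it on every query.
  def query
    @store.query(graph: @graph_name).map { |s, p, o, _| [s, p, o] }
  end
end
```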

Having to pass the graph name into every query is going to be tedious and error-prone.

Other
Have the resolver wrap its repository in an RDF::Graph.
Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.

Hooking cache evictions to accessors is a simple solution.

Other
Put a hook on the store's accessors to evict the expired cache entries.

How about pay money for proper infrastructure?

Other
It turns out the server at DigitalOcean is actually really slow.
We can shell out for proper infrastructure when this thing starts making some money.

How are the parameters reconciled if you have multiples at once?

Other
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
[ADOPTED] Treat the `instance-of`, `in-range-of`, `in-domain-of` parameters as a union, that is, select all resources that match any of them.

How are we going to handle cache entity bodies?

Other
Implement caching of generated representations against Store::Digest.
Implement internal caching.

How are we going to represent the metadata for the cache entries?

Has Narrower
How do we handle private cache entries?
Other
Create an ontology based on PROV-O and the HTTP message ontology that can represent the appropriate metadata.
Implement internal caching.
Index the cache entries using a dedicated key-value store.
Put the cache metadata into the RDF store.

How do we actually go about evicting expired cache entries?

Other
Create a background worker thread that polls regularly for expired cache entries.
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
Put a hook on the store's accessors to evict the expired cache entries.
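
The background-worker position can be sketched as follows. This is a minimal illustration, not Store::Digest's actual API: the store interface (`expired_entries`, `remove`) and the class name are hypothetical.

```ruby
require 'timeout'

class CacheSweeper
  def initialize(store, interval: 60)
    @store    = store
    @interval = interval
    @stopping = Queue.new # one-shot stop signal
  end

  def start
    @thread = Thread.new do
      until stop_requested?
        # Evict whatever has expired as of this poll.
        @store.expired_entries(Time.now).each { |entry| @store.remove(entry) }
      end
    end
    self
  end

  def stop
    @stopping << true
    @thread.join
  end

  private

  # Wait up to @interval seconds for a stop signal; false means keep going.
  def stop_requested?
    Timeout.timeout(@interval) { @stopping.pop }
    true
  rescue Timeout::Error
    false
  end
end
```

Note this is exactly the shape that raises the multi-process concern above: each process running the store gets its own sweeper thread.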

How do we ensure that we return quickly and not hang up on the cache eviction process?

Other
Put a hook on the store's accessors to evict the expired cache entries.
Run the cache eviction process in another thread.

How do we get instance data in and out?

Other
[DONE] Make the Intertwingler home directory a Docker volume.
[DONE] Ship Intertwingler as a Docker image.

How do we handle authentication and access control?

Other
Leave Forget Passwords on the DO server, and hack both it and Intertwingler, respectively, to transmit and receive a JWT.
Split Forget Passwords into app and middleware and wrap the latter around Intertwingler (on deuce).
[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.
[DONE] Put Sense Atlas online so the client can observe its development.

How do we handle private cache entries?

Has Broader
How are we going to represent the metadata for the cache entries?
Other
Associate private cache entities with the principal that requested them.

How do we keep drafts and other works in progress from being exposed to the public?

Other
Implement a rudimentary mechanism for partitioning the graph into "public" and "private".
Put doriantaylor.com into Intertwingler.

How do we resolve RFC6920 content addresses to their HTTP(S) counterparts?

Has Broader
There are plenty of other durable URI schemes besides `urn:uuid:` and `ni:` (such as `ark:` or `doi:`).
Other
Overhaul the resolver so that it has pluggable transformations between durable identifiers and routable addresses.
We want Intertwingler to have a unified interface to opaque blob storage, whether a given blob is intended to be persisted or not.
See Also

How do you get the users into the system?

Other
A signed (and/or encrypted) JWT over an ordinary HTTP reverse proxy connection can perform the same function as REMOTE_USER within a FastCGI interface.
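
The JWT-as-REMOTE_USER idea can be sketched as a piece of Rack middleware. This is a hedged, stdlib-only illustration assuming an HS256-signed token arriving in a reverse-proxy header; the header name, shared secret, and class name are all hypothetical.

```ruby
require 'openssl'
require 'base64'
require 'json'

class JWTRemoteUser
  HEADER = 'HTTP_X_FORWARDED_USER_TOKEN' # hypothetical proxy header

  def initialize(app, secret:)
    @app    = app
    @secret = secret
  end

  def call(env)
    if (token = env[HEADER])
      header, payload, sig = token.split('.')
      # Recompute the HS256 signature over header.payload and compare.
      expected = Base64.urlsafe_encode64(
        OpenSSL::HMAC.digest('SHA256', @secret, [header, payload].join('.')),
        padding: false)
      if sig == expected
        claims = JSON.parse(Base64.urlsafe_decode64(payload))
        # Same contract REMOTE_USER fulfils under FastCGI.
        env['REMOTE_USER'] = claims['sub']
      end
    end
    @app.call(env)
  end
end
```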

How do you make sure that you net resources that are subclasses of, or related by subproperties to, the supplied parameters?

Other
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
[DONE] Perform RDFS/OWL inferencing against the supplied parameters and result set.

How do you safely store API credentials?

Other
Create an API bridge that will connect the two systems.

How does a client determine how to fill out the entire collection?

Other
Attach "next" and "previous" relations to the window resources, when applicable.
Represent windowed slices of a collection as ephemeral entities. Have them refer back to the main collection with the unmanageably many elements.
[DONE] Create an "index" resource that enumerates all the windowed resources.
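
The windowed-slice idea with "next"/"previous" relations can be sketched as follows; the window identifiers and the shape of each ephemeral resource are made up for illustration.

```ruby
# Slice a collection into fixed-size windows; each window is an ephemeral
# resource that refers back to the main collection and links its neighbours.
def windows(collection_id, members, size)
  slices = members.each_slice(size).to_a
  slices.each_with_index.map do |items, i|
    { collection: collection_id, # pointer back to the main collection
      items:      items,
      previous:   i.zero?              ? nil : "#{collection_id}?window=#{i - 1}",
      next:       i == slices.size - 1 ? nil : "#{collection_id}?window=#{i + 1}" }
  end
end

pages = windows('/people', (1..10).to_a, 4)
```

A client that lands on any one window can walk the `next`/`previous` links to fill out the whole collection, which is the question this answers.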

How does the front-end know where the catalogues are?

Other
[DONE] Create a set of "catalogue resources".
[DONE] Create an index as a top-level entry point to the catalogues.
[DONE] Define a relation that connects the tool to the entry point.

How does the user assent to upgrading the metadata store?

Other
Detect what version the metadata layout is and whine at the user to upgrade it on their own time.
Once again, pass in an initialization parameter to explicitly upgrade the store.

How is the inventory going to be related to the index?

Other
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.

How will we construct the UI of the RDF-KV forms?

Has Narrower
[RESOLVED] Intertwingler currently does not have the resources to furnish the IBIS tool with fodder for autocompletes.
Other
[DONE] Create a set of "catalogue resources".
[DONE] Create an RDF-KV handler.

I

An inventory that takes all these parameters does much more than just display resources in the range of a given property.

Other
Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.

I don't love having my dev server conscripted into production.

Other
Get Intertwingler off deuce as soon as financially possible.
[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.

I have fibre and get a 1.7ms ping time to the server, even over ipsec.

Other
[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.

I want TNoS to be effectively a structured discussion board around the book, with other data resources, running on Intertwingler.

Has Broader
I want to use Intertwingler to run my Web properties.
Other
Put natureof.software into Intertwingler.

I want to use Intertwingler to run my Web properties.

Has Narrower
I want TNoS to be effectively a structured discussion board around the book, with other data resources, running on Intertwingler.
Other
Put doriantaylor.com into Intertwingler.
Put intertwingler.net into Intertwingler (i.e., make it run its own website).
Put makethingsmakesense.com into Intertwingler.
Put methodandstructure.com into Intertwingler.
Put natureof.software into Intertwingler.
We will need to be able to handle a much higher request rate if we expose Intertwingler to the open internet.

If an entity is inserted into the store overtop of a cache entry, clear the cache flag and ensure it is never re-enabled.

Has Broader
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
Other
What if a cache entry is superseded by an entity that is durable?

If you have multiple processes accessing Store::Digest, this means multiple cleanup threads.

Other
Create a background worker thread that polls regularly for expired cache entries.

Implement a rudimentary mechanism for partitioning the graph into "public" and "private".

Has Broader
Put doriantaylor.com into Intertwingler.
Other
How do we keep drafts and other works in progress from being exposed to the public?
doriantaylor.com only needs a basic notion of "private" (only visible to me) and "public".

Implement caching of generated representations against Store::Digest.

Has Broader
Implement internal caching.
Put doriantaylor.com into Intertwingler.
Put vocab.methodandstructure.com into Intertwingler.
Has Narrower
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
Other
How are we going to handle cache entity bodies?
It would be useful to be able to indicate to the engine that a given handler manages a content-addressable store (and thus can be used for caching).

Implement e-mail scrambler transform.

Has Broader
Get Intertwingler the hell online.
Other
E-mail addresses are still going out in the clear.

Implement internal caching.

Has Broader
Finish implementing addressable transforms.
Has Narrower
Implement caching of generated representations against Store::Digest.
Other
Even ordinary page GETs when the graph state has not changed can be slow, particularly the first time a user loads a generated resource.
How are we going to handle cache entity bodies?
How are we going to represent the metadata for the cache entries?
Intertwingler currently lacks any internal caching, so I don't trust it standing up to even the baseline thrash of the open internet.
It is possible for two different requests to produce the same cache entity.
We were relying on browser cache to keep XSLT from being intolerably slow.
We will need to be able to handle a much higher request rate if we expose Intertwingler to the open internet.

Implement provisional JSON-LD variants for large resources (notably the catalogues) accessed through client-side scripting.

Other
`RDF::Vocabulary.find` is a major culprit. It is used in `Intertwingler::Resolver#coerce_resource` and does a sequential scan over hundreds of vocabs every time a lookup is done.

Index the cache entries using a dedicated key-value store.

Other
A dedicated cache metadata store will be much more efficient in many ways than trying to use RDF.
How are we going to represent the metadata for the cache entries?
It is unlikely that anything actually needs to interface with the cache metadata besides the cache engine itself.

Indiscriminately loading transform handlers is a much simpler design.

Other
Load all tfo:Function entities in the graph indiscriminately.

Install Intertwingler on DigitalOcean VPS as a FastCGI daemon.

Has Broader
Get Intertwingler the hell online.
Has Narrower
Fish the FastCGI driver out of old Rack code and install it under /var/lib/intertwingler.
[DONE] Upgrade DigitalOcean server to handle the additional load introduced by Intertwingler.
Other
The Rack maintainers have eliminated the FastCGI driver from Rack ≥3.0 (and thus Rackup) for being "antiquated".

Intertwingler currently lacks any internal caching, so I don't trust it standing up to even the baseline thrash of the open internet.

Other
Get Intertwingler the hell online.
Get Sense Atlas/Intertwingler running the client extranets, which are shielded from the thrash of the open internet.
Implement internal caching.
We will need to be able to handle a much higher request rate if we expose Intertwingler to the open internet.
[DONE] Put Sense Atlas on its own website, senseatlas.net.

Intertwingler has been observed consuming up to a gigabyte of resident-set RAM during development.

Other
Separate RDF stores are the second most reliable way to keep data from leaking across sites under management, second to separate Intertwingler instances altogether.
There are almost certainly memory leaks in and around Intertwingler and its snarl of dependencies.
[DONE] Upgrade DigitalOcean server to handle the additional load introduced by Intertwingler.

Intertwingler is currently sharing the same RDF store for all sites under management.

Other
Arrange for each site under management to have its own RDF store.
Get Sense Atlas/Intertwingler running the client extranets, which are shielded from the thrash of the open internet.
[RESOLVED] As it stands, every site managed by Intertwingler can see the data from every other site.

Intertwingler needs proper functioning multi-site support anyway.

Other
Get Sense Atlas/Intertwingler running the client extranets, which are shielded from the thrash of the open internet.
While Intertwingler has been designed to drive multiple websites (authorities), it has only been tried with a single site, and there are invariably kinks in the implementation.

It is possible for two different requests to produce the same cache entity.

Other
Can the resource identified by the hash URI be the cache entry?
Implement internal caching.

It is unlikely that anything actually needs to interface with the cache metadata besides the cache engine itself.

Other
Index the cache entries using a dedicated key-value store.
Put the cache metadata into the RDF store.

It turns out that there is a bug in the LMDB bindings that keeps nested transactions from working properly.

Other
Change the LMDB driver on Store::Digest so the main entry table uses integer keys rather than the much larger "primary" hash.
Fix the C code in the Ruby LMDB bindings so that read-only transactions can "nest" (i.e., just no-op and use the parent transaction).

It turns out the server at DigitalOcean is actually really slow.

Other
How about paying money for proper infrastructure?
[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.
[DONE] Upgrade DigitalOcean server to handle the additional load introduced by Intertwingler.

It turns out that the default (AMD64) build is unusably slow in the client's development environment (Darwin/ARM64).

Other
[DONE] Generate an ARM64 Docker image.
[DONE] Ship Intertwingler as a Docker image.
[DONE] Ship an Intertwingler instance that the client can use.

It will probably be easier to test these things anyway if they are classes instead of mixins.

Other
Refactor Store::Digest::Meta and Store::Digest::Blob so that they are classes instead of mixins and can be subclassed, and make the driver wrap them.

It would be useful to be able to indicate to the engine that a given handler manages a content-addressable store (and thus can be used for caching).

Other
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
Implement caching of generated representations against Store::Digest.

L

Leave Forget Passwords on the DO server, and hack both it and Intertwingler, respectively, to transmit and receive a JWT.

Has Broader
[DONE] Put Sense Atlas online so the client can observe its development.
Other
A signed (and/or encrypted) JWT over an ordinary HTTP reverse proxy connection can perform the same function as REMOTE_USER within a FastCGI interface.
How do we handle authentication and access control?

Let's keep makethingsmakesense.com and just redirect it for now.

Other
Do we actually want to keep this domain alive?

Lexicon appears to use Unicode grapheme clusters as the unit for text length.

Other
Find out if SHACL also uses grapheme clusters.
Try to create a mapping function between Lexicon and RDF.
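
The discrepancy is easy to demonstrate with a combining accent: "é" composed as "e" plus U+0301 is two code points but one grapheme cluster. If Lexicon counts clusters while a validator counts code points, length constraints will disagree on strings like this.

```ruby
s = "e\u0301" # "é" as base letter + combining acute accent

codepoints = s.length                   # counts code points
graphemes  = s.grapheme_clusters.length # counts grapheme clusters (Ruby 2.5+)
```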

Load all tfo:Function entities in the graph indiscriminately.

Other
Indiscriminately loading transform handlers is a much simpler design.
Ordinary transforms are already attached to their respective queues. How do we handle transforms for addressable queues?
There already exists considerable infrastructure for selectively loading transform handlers.
There is a concern of memory consumption of unused handlers.

M

Make an addressable transform of the form `;backlinks=property,window-start,window-end`, one for each property (that runs long).

Has Broader
Design a mechanism for offloading and/or paginating backlinks.
Other
There is no reason why an addressable transform can't completely replace a response body, rather than just manipulate it.
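
A sketch of how such a transform might parse its path parameter. The syntax (`;backlinks=property,window-start,window-end`) is taken from the position above; the function name and return shape are hypothetical.

```ruby
# Parse a trailing ;backlinks=property,start,end path parameter, returning
# nil when the request carries no such parameter.
def parse_backlinks(path)
  md = path.match(/;backlinks=([^;,]+),(\d+),(\d+)\z/) or return nil
  { property: md[1], start: md[2].to_i, end: md[3].to_i }
end

window = parse_backlinks('/some-resource;backlinks=dct:references,0,50')
```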

Make Intertwingler an ATProto PDS.

Other
ATProto speaks XRPC (over WebSockets) while Intertwingler is very heavily REST.
We don't even know if Lexicon is isomorphic to (or even embeds into) RDF.

Make it possible to query the RDF store for the modification times of specific resources and/or statements.

Other
Granular modification times would be a byproduct of a full transaction history.
Sense Atlas is particularly slow after POSTs because the RDF store only has a single global modification time (that I had to hack in).

Migrate App::IBIS content to Intertwingler/Sense Atlas.

Has Broader
Get Intertwingler the hell online.
Other
The address space is already occupied by the old statically-generated sites and the old App::IBIS prototype.

Mixing cache and non-cache entities in the `dtime` index will make it necessary to parse and scan each entry when it comes time to expire cache entries.

Other
Create an `etime` index for cache expiry; only put actual tombstones in the `dtime` index, and only put cache entries in `etime`.
Reuse the `dtime` field and just set it to the future.
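
The split-index position can be sketched in miniature: tombstones keep the `dtime` index, while cache entries go into a dedicated expiry index keyed on `etime`, so eviction is an "everything up to now" scan with no per-entry parsing. The layout here is hypothetical, not Store::Digest's actual one.

```ruby
class EtimeIndex
  def initialize
    # expiry time => entry keys due at that time (toy in-memory layout)
    @by_expiry = Hash.new { |h, k| h[k] = [] }
  end

  def add(key, expires_at)
    @by_expiry[expires_at] << key
  end

  # All cache entries whose expiry is at or before `now`.
  def expired(now)
    @by_expiry.keys.select { |t| t <= now }
              .flat_map { |t| @by_expiry[t] }
  end
end

index = EtimeIndex.new
index.add(:a, 100)
index.add(:b, 200)
index.add(:c, 150)
```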

Modify the transform that inserts XSLT processing instructions to match against asserted types (or specific resources).

Has Broader
Use the import precedence native to XSLT.

O

Once again, pass in an initialization parameter to explicitly upgrade the store.

Has Broader
Detect what version the metadata layout is and whine at the user to upgrade it on their own time.
Other
How does the user assent to upgrading the metadata store?

Ordinary transforms are already attached to their respective queues. How do we handle transforms for addressable queues?

Other
Finish implementing addressable transforms.
Load all tfo:Function entities in the graph indiscriminately.
Put every addressable transform in the queue and have the path parameters filter out and rearrange the remnants.

Overhaul Store::Digest.

Has Narrower
Change the LMDB driver on Store::Digest so the main entry table uses integer keys rather than the much larger "primary" hash.
Create a mechanism that upgrades old metadata database layouts to the new one.
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.

Overhaul the resolver so that it has pluggable transformations between durable identifiers and routable addresses.

Has Narrower
Create a configurable "pecking order" of URI schemes so the precedence is unambiguous.
Other
How do we resolve RFC6920 content addresses to their HTTP(S) counterparts?
What happens if a resource has more than one durable identifier?

The original inventory resource in the prototype goes by instance of class rather than range.

Other
[DONE] Add an `instance-of` parameter that takes class identifiers for parity with the original resource.
[DONE] Put the `in-range-of` property into a parameter.

P

Paginate the result set.

Has Broader
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
Has Narrower
Represent windowed slices of a collection as ephemeral entities. Have them refer back to the main collection with the unmanageably many elements.
Other
What if the result set is enormous?

Partitioning site-specific data by named graph also messes with the engine/resolver/handler/transform configuration.

Other
Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.

Perhaps we should be using authentication groups instead of principals for cache entities.

Other
What about private cache entities that also happen to match public responses?

Plan projects and resources with the use of the Process Model Ontology.

Has Broader
Client wants to use the IBIS tool (Sense Atlas) in everyday operations.
Other
Ship a front end with interfaces for IBIS, PM, and a rudimentary FOAF/ORG editor.

principal

Principal is the name given to an identity for authentication purposes. A principal is the thing that makes contact with an authenticated information system. As such, it can represent an actual human, or an agent of that human (either another human or an automated system), or any other entity that requires access.

Properties go in the other direction too, so it would be useful to be able to pick subjects of properties as well as objects.

Other
[DONE] Make the inventory resource respond to an `in-domain-of` parameter for symmetry with `in-range-of`.
[DONE] Put the `in-range-of` property into a parameter.

Put a hook on the store's accessors to evict the expired cache entries.

Has Broader
Run the cache eviction process in another thread.
Other
Hooking cache evictions to accessors is a simple solution.
How do we actually go about evicting expired cache entries?
How do we ensure that we return quickly and not hang up on the cache eviction process?
The state of the cache is contingent on ongoing interactions with the store.

Put doriantaylor.com into Intertwingler.

Has Narrower
Implement a rudimentary mechanism for partitioning the graph into "public" and "private".
Implement caching of generated representations against Store::Digest.
Other
How do we keep drafts and other works in progress from being exposed to the public?
I want to use Intertwingler to run my Web properties.
There are a number of scaled and transformed images on the site.
We will need to be able to handle a much higher request rate if we expose Intertwingler to the open internet.

Put every addressable transform in the queue and have the path parameters filter out and rearrange the remnants.

Has Broader
Finish implementing addressable transforms.
Other
Ordinary transforms are already attached to their respective queues. How do we handle transforms for addressable queues?
There already exists considerable infrastructure for selectively loading transform handlers.

Put intertwingler.net into Intertwingler (i.e., make it run its own website).

Other
I want to use Intertwingler to run my Web properties.

Put makethingsmakesense.com into Intertwingler.

Has Narrower
[DONE] Port the IBIS tool to Intertwingler.
Other
Do we actually want to keep this domain alive?
I want to use Intertwingler to run my Web properties.

Put methodandstructure.com into Intertwingler.

Has Narrower
Alias privatealpha.com to methodandstructure.com.
Put vocab.methodandstructure.com into Intertwingler.
[DONE] Put client extranets into Intertwingler.
Other
I want to use Intertwingler to run my Web properties.

Put natureof.software into Intertwingler.

Other
I want TNoS to be effectively a structured discussion board around the book, with other data resources, running on Intertwingler.
I want to use Intertwingler to run my Web properties.
There are a number of scaled and transformed images on the site.

Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.

Has Broader
Get Intertwingler the hell online.
Has Narrower
[DONE] Clean up slow RDF::Graph methods and push upstream.
Other
Having to pass the graph name into every query is going to be tedious and error-prone.
Partitioning site-specific data by named graph also messes with the engine/resolver/handler/transform configuration.
This strategy may prevent us from using named graphs for something different later on.
[RESOLVED] As it stands, every site managed by Intertwingler can see the data from every other site.
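
The partitioning strategy can be illustrated with plain structs (deliberately not RDF.rb, to keep the sketch self-contained): every statement carries a graph name derived from the site's authority, and every query is narrowed to that graph.

```ruby
Quad = Struct.new(:subject, :predicate, :object, :graph_name)

class SharedRepository
  def initialize
    @quads = []
  end

  def insert(quad)
    @quads << quad
  end

  # Narrowing every lookup to one named graph is the per-site partition:
  # a site can never see statements tagged with another site's authority.
  def query(graph_name:)
    @quads.select { |q| q.graph_name == graph_name }
  end
end

repo = SharedRepository.new
repo.insert Quad.new('/about', 'dct:title', 'About', 'https://senseatlas.net/')
repo.insert Quad.new('/about', 'dct:title', 'About', 'https://doriantaylor.com/')

senseatlas_only = repo.query(graph_name: 'https://senseatlas.net/')
```

The tedium objection above is visible even here: `graph_name:` has to thread through every call site, which is what wrapping the repository in an RDF::Graph would hide.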

Put the cache metadata into the RDF store.

Has Narrower
Create an ontology based on PROV-O and the HTTP message ontology that can represent the appropriate metadata.
Other
Do we actually want/need really busy, chatty cache metadata in the RDF store?
How are we going to represent the metadata for the cache entries?
It is unlikely that anything actually needs to interface with the cache metadata besides the cache engine itself.

Put vocab.methodandstructure.com into Intertwingler.

Has Broader
Put methodandstructure.com into Intertwingler.
Has Narrower
Implement caching of generated representations against Store::Digest.
Other
We will need to be able to handle a much higher request rate if we expose Intertwingler to the open internet.

The plan was always to turn Forget Passwords into proper Rack middleware.

Other
Split Forget Passwords into app and middleware and wrap the latter around Intertwingler (on deuce).

R

A REST-to-WebSockets adapter will make it easy to write handlers (microservices).

Other
Create an adapter that will map between ordinary HTTP resources and WebSockets.

RDF::Graph slows down operations by two orders of magnitude!

Other
Bust out the profiler and fix whatever's causing RDF::Graph to run (INSANELY) slow.
Have the resolver wrap its repository in an RDF::Graph.
[DONE] Clean up slow RDF::Graph methods and push upstream.

Refactor Store::Digest::Meta and Store::Digest::Blob so that they are classes instead of mixins and can be subclassed, and make the driver wrap them.

Has Broader
Create a variant of the metadata driver for both the old and the new version of the database layout.
Other
For some reason I made Store::Digest with all mixins that can't do inheritance.
It will probably be easier to test these things anyway if they are classes instead of mixins.
Turning these Store::Digest modules into classes may not actually be necessary.

Represent windowed slices of a collection as ephemeral entities. Have them refer back to the main collection with the unmanageably many elements.

Has Broader
Paginate the result set.
[DONE] Create a set of "catalogue resources".
Other
How does a client determine how to fill out the entire collection?
What happens when somebody tries to directly access the main collection?
What happens when you have an extreme quantity of entities for a given class and/or in the domain or range of a given predicate?

Reuse the `dtime` field and just set it to the future.

Has Broader
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
Has Narrower
Create an `etime` index for cache expiry; only put actual tombstones in the `dtime` index, and only put cache entries in `etime`.
Other
Mixing cache and non-cache entities in the `dtime` index will make it necessary to parse and scan each entry when it comes time to expire cache entries.
What do we do about the expiration time of the cache entries?

Rough out the new classes and try loading them with instance data; take notes for what can be improved.

Has Narrower
[DONE] Put Sense Atlas online so the client can observe its development.

Run the cache eviction process in another thread.

Has Narrower
Create a background worker thread that polls regularly for expired cache entries.
Put a hook on the store's accessors to evict the expired cache entries.
Other
How do we ensure that we return quickly and not hang up on the cache eviction process?

The Rack maintainers have eliminated the FastCGI driver from Rack ≥3.0 (and thus Rackup) for being "antiquated".

Other
Fish the FastCGI driver out of old Rack code and install it under /var/lib/intertwingler.
Install Intertwingler on DigitalOcean VPS as a FastCGI daemon.

[RESOLVED] As it stands, every site managed by Intertwingler can see the data from every other site.

Other
Arrange for each site under management to have its own RDF store.
Get rudimentary multi-site support working on Intertwingler.
Intertwingler is currently sharing the same RDF store for all sites under management.
Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.

[RESOLVED] Intertwingler currently does not have the resources to furnish the IBIS tool with fodder for autocompletes.

Has Broader
How will we construct the UI of the RDF-KV forms?
Other
[DONE] Create a set of "catalogue resources".
[DONE] Port the IBIS tool to Intertwingler.

[RESOLVED] The front-end code to the IBIS tool is currently in the back-end repository.

Other
[DONE] Decouple the IBIS front-end from its back-end (into its own Git[hub] repository).
[DONE] Port the IBIS tool to Intertwingler.

`RDF::Vocabulary.find` is a major culprit. It is used in `Intertwingler::Resolver#coerce_resource` and does a sequential scan over hundreds of vocabs every time a lookup is done.

Other
Implement provisional JSON-LD variants for large resources (notably the catalogues) accessed through client-side scripting.

S

A separate RDF store per site will make a big mess with Intertwingler's configuration data.

Other
Arrange for each site under management to have its own RDF store.

A signed (and/or encrypted) JWT over an ordinary HTTP reverse proxy connection can perform the same function as REMOTE_USER within a FastCGI interface.

Other
How do you get the users into the system?
Leave Forget Passwords on the DO server, and hack both it and Intertwingler, respectively, to transmit and receive a JWT.

Sense Atlas is particularly slow after POSTs because the RDF store only has a single global modification time (that I had to hack in).

Other
Even ordinary page GETs when the graph state has not changed can be slow, particularly the first time a user loads a generated resource.
Make it possible to query the RDF store for the modification times of specific resources and/or statements.

Separate RDF stores are the second most reliable way to keep data from leaking across sites under management, second to separate Intertwingler instances altogether.

Other
Arrange for each site under management to have its own RDF store.
Intertwingler has been observed consuming up to a gigabyte of resident-set RAM during development.

Separate stores will consume considerably more RAM and drive space.

Other
Arrange for each site under management to have its own RDF store.

Ship a front end with interfaces for IBIS, PM, and a rudimentary FOAF/ORG editor.

Has Narrower
Get Intertwingler the hell online.
Other
Client wants to use the IBIS tool (Sense Atlas) in everyday operations.
Get rudimentary multi-site support working on Intertwingler.
Plan projects and resources with the use of the Process Model Ontology.
The XSLT component kind of sucks.

Shipping Intertwingler as a Docker image will afford putting off the open-source diplomacy of the upstream patches.

Other
[DONE] Ship Intertwingler as a Docker image.

Should the user be able to silence the whining?

Other
Detect what version the metadata layout is and whine at the user to upgrade it on their own time.
Yes. Supply an initialization parameter to ignore the whining.

Split Forget Passwords into app and middleware and wrap the latter around Intertwingler (on deuce).

Other
Embedding Forget Passwords into Intertwingler is going to severely complicate (and compromise) the configuration for both systems, as well as the installation as a whole.
How do we handle authentication and access control?
The plan was always to turn Forget Passwords into proper Rack middleware.

Store::Digest uses a baroque and inefficient layout for its metadata.

Other
Change the LMDB driver on Store::Digest so the main entry table uses integer keys rather than the much larger "primary" hash.
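
The integer-key idea in miniature: an 8-byte big-endian packed integer sorts correctly under LMDB's default lexicographic (memcmp) key comparison, and is far smaller than a multi-digest "primary" hash used as the key. The helper name is made up.

```ruby
# Pack an integer as a fixed-width 64-bit unsigned big-endian key, so that
# byte-wise comparison of keys matches numeric order.
def int_key(n)
  [n].pack('Q>')
end

keys = [300, 2, 70_000].map { |n| int_key(n) }
```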

The state of the cache is contingent on ongoing interactions with the store.

Other
Put a hook on the store's accessors to evict the expired cache entries.

T

There already exists considerable infrastructure for selectively loading transform handlers.

Other
Load all tfo:Function entities in the graph indiscriminately.
Put every addressable transform in the queue and have the path parameters filter out and rearrange the remnants.

There are a number of scaled and transformed images on the site.

Other
Finish implementing addressable transforms.
Put doriantaylor.com into Intertwingler.
Put natureof.software into Intertwingler.

There are all manners of properties, so hard-coding a single one is silly.

Other
Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
[DONE] Put the `in-range-of` property into a parameter.

There are almost certainly memory leaks in and around Intertwingler and its snarl of dependencies.

Other
Dedicate some time to hunting down and squashing memory leaks and general wasteful memory usage.
Intertwingler has been observed consuming up to a gigabyte of resident-set RAM during development.

There are plenty of other durable URI schemes besides `urn:uuid:` and `ni:` (such as `ark:` or `doi:`).

Has Narrower
How do we resolve RFC6920 content addresses to their HTTP(S) counterparts?

There is a concern of memory consumption of unused handlers.

Other
Load all tfo:Function entities in the graph indiscriminately.

There is no reason why an addressable transform can't completely replace a response body, rather than just manipulate it.

Other
Make an addressable transform of the form `;backlinks=property,window-start,window-end`, one for each property (that runs long).

There is still value in getting this working for Forget Passwords, which is stuck on Rack 2.x.

Other
File a bug upstream to get proper support for FastCGI in Rackup.

This strategy may prevent us from using named graphs for something different later on.

Other
Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.

To populate potential linkages, the tool needs a list of all resources that are in the range of a given property.

Other
Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
[DONE] Create an index as a top-level entry point to the catalogues.

Try messing around with SaxonJS?

Other
Despite being standard, reliable, fast to execute, and built into every browser, XSLT 1.0 in particular is really clunky to work with.

Try to create a mapping function between Lexicon and RDF.

Has Narrower
Find out if SHACL also uses grapheme clusters.
Other
Lexicon appears to use Unicode grapheme clusters as the unit for text length.
We don't even know if Lexicon is isomorphic to (or even embeds into) RDF.

Turning these Store::Digest modules into classes may not actually be necessary.

Other
Refactor Store::Digest::Meta and Store::Digest::Blob into classes (rather than mixins) so they can be subclassed, and make the driver wrap them.

U

Use the import precedence native to XSLT.

Has Narrower
Modify the transform that inserts XSLT processing instructions to match against asserted types (or specific resources).
Other
The way the template "inheritance" currently works is extremely messy.

W

The way the extranets are configured, it would be a big pain to carve out an exception for just one of them.

Other
For now, just configure Intertwingler to power this one client's extranet.

The way the template "inheritance" currently works is extremely messy.

Other
The XSLT component kind of sucks.
Use the import precedence native to XSLT.

We can shell out for proper infrastructure when this thing starts making some money.

Other
Get Intertwingler off deuce as soon as financially possible.
How about paying money for proper infrastructure?
[DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.

We don't even know if Lexicon is isomorphic to (or even embeds into) RDF.

Other
Make Intertwingler an ATProto PDS.
Try to create a mapping function between Lexicon and RDF.

We don't really have time to write migration code and we're not sure how many there are out there in the wild.

Other
Create a mechanism that upgrades old metadata database layouts to the new one.

We don't want to take people by surprise by forcing them into upgrading their data store.

Other
Create a mechanism that upgrades old metadata database layouts to the new one.
Detect what version the metadata layout is and whine at the user to upgrade it on their own time.
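The detect-and-whine position could look something like the following; `CURRENT_LAYOUT`, the metadata hash, and the `quiet:` parameter (the proposed silencing switch) are all hypothetical, not actual Store::Digest internals:

```ruby
# Sketch of the detect-and-whine strategy: inspect a version marker in
# the metadata and warn, rather than migrate, when the layout is old.
# The quiet: keyword is the proposed initialization parameter for
# silencing the whining.
CURRENT_LAYOUT = 2

def check_layout!(metadata, quiet: false)
  version = metadata.fetch(:layout_version, 1)
  return :current if version >= CURRENT_LAYOUT

  unless quiet
    warn "metadata layout v#{version} is out of date " \
         "(current is v#{CURRENT_LAYOUT}); please upgrade at your leisure"
  end
  :outdated
end
```

Treating a missing marker as version 1 means already-initialized stores that predate the marker still get detected correctly.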

We have a good relationship with the Ruby-RDF maintainer, who very promptly merges and pushes out our patches.

Other
Bust out the profiler and fix whatever's causing RDF::Graph to run (INSANELY) slow.

We want Intertwingler to have a unified interface to opaque blob storage, whether a given blob is intended to be persisted or not.

Other
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
How do we resolve RFC6920 content addresses to their HTTP(S) counterparts?

We were relying on browser cache to keep XSLT from being intolerably slow.

Other
Create a server-side XSLT transform.
Implement internal caching.

We will need to be able to handle a much higher request rate if we expose Intertwingler to the open internet.

Other
I want to use Intertwingler to run my Web properties.
Implement internal caching.
Intertwingler currently lacks any internal caching, so I don't trust it to stand up to even the baseline thrash of the open internet.
Put doriantaylor.com into Intertwingler.
Put vocab.methodandstructure.com into Intertwingler.

We would actually be able to represent real sequences of actions instead of just suggestions about what to do.

Other
[DONE] Port the IBIS tool to Intertwingler.

We would actually be able to unblock development on the IBIS tool that has been sitting idle for over a decade.

Other
[DONE] Port the IBIS tool to Intertwingler.

What about changes that happen on GitHub?

Other
Create an API bridge that will connect the two systems.

What about private cache entities that also happen to match public responses?

Other
Associate private cache entities with the principal that requested them.
Associate the cache entity with a null principal to indicate that it is actually public.
Perhaps we should be using authentication groups instead of principals for cache entities.

What about writing to the graph?

Other
[DONE] Create an RDF-KV handler.
[DONE] Ship an Intertwingler instance that the client can use.

What do we do about the expiration time of the cache entries?

Other
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
Reuse the `dtime` field and just set it to the future.

What happens if a resource has more than one durable identifier?

Other
Create a configurable "pecking order" of URI schemes so the precedence is unambiguous.
Overhaul the resolver so that it has pluggable transformations between durable identifiers and routable addresses.
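The pecking-order position amounts to sorting a resource's identifiers by scheme precedence and taking the winner. A sketch, with a purely illustrative default order:

```ruby
# Sketch of a configurable "pecking order" for durable URI schemes:
# when a resource has several durable identifiers, sort them by scheme
# precedence and take the first. The default order is illustrative,
# not canonical.
DEFAULT_PECKING_ORDER = %w[urn ni ark doi].freeze

def preferred_identifier(uris, order: DEFAULT_PECKING_ORDER)
  uris.min_by do |uri|
    scheme = uri.split(":", 2).first
    order.index(scheme) || order.length # unknown schemes sort last
  end
end
```

Making `order:` configurable rather than hard-coded keeps the precedence unambiguous without privileging any one scheme in the code itself.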

What happens when somebody tries to directly access the main collection?

Has Broader
What happens when you have an extreme quantity of entities for a given class and/or in the domain or range of a given predicate?
Other
Represent windowed slices of a collection as ephemeral entities. Have them refer back to the main collection, with its unmanageably many elements.
When somebody tries to access the main collection, redirect them to the first window onto the collection.
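The redirect position could be sketched as a Rack-style response: a bare request for the main collection gets a 303 to the first window. The `;window=start,end` path-parameter syntax here is illustrative, not an established Intertwingler convention:

```ruby
# Sketch of redirect-to-first-window: a request for the bare collection
# 303-redirects to an ephemeral window resource; a windowed request is
# served directly. The window path-parameter syntax is hypothetical.
WINDOW_SIZE = 50

def collection_response(path, windowed:)
  if windowed
    [200, { "content-type" => "text/html" }, ["(window contents)"]]
  else
    first = "#{path};window=0,#{WINDOW_SIZE}"
    [303, { "location" => first }, []]
  end
end
```

A 303 (rather than 301/302) signals that the window is a different resource standing in for the unmanageably large collection, not the collection itself.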

What happens when you have an extreme quantity of entities for a given class and/or in the domain or range of a given predicate?

Has Narrower
What happens when somebody tries to directly access the main collection?
Other
Represent windowed slices of a collection as ephemeral entities. Have them refer back to the main collection, with its unmanageably many elements.
[DONE] Create a set of "catalogue resources".

What if a cache entry is superseded by an entity that is durable?

Other
Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
If an entity is inserted into the store overtop of a cache entry, clear the cache flag and ensure it is never re-enabled.

What if the client has trouble?

Other
[DONE] Create a Git repository with a prefabricated configuration.
[DONE] Ship an Intertwingler instance that the client can use.
[DONE] Write a getting-started tutorial.

What if the result set is enormous?

Other
Paginate the result set.
[ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.

What if we only want resources that match assertions with no inferencing?

Other
[DONE] Create an `inferred` flag that is on by default but can be set to false that will disable inferencing.
[DONE] Perform RDFS/OWL inferencing against the supplied parameters and result set.

What if you wanted only inferred results with no asserted ones?

Other
[DONE] Create an `asserted` flag that is on by default but can be set to false to eliminate asserted resources from the result set.
[DONE] Create an `inferred` flag that is on by default but can be set to false that will disable inferencing.
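Together the two flags partition the result set: `asserted` off yields only inferred resources, `inferred` off disables inferencing, and both on (the defaults) yields everything. A sketch of how they might combine (the result structure is a stand-in for whatever the real inventory handler produces):

```ruby
# Sketch of combining the asserted/inferred flags (both on by default)
# to filter an inventory result set. Each result is a stand-in hash
# flagged with how it entered the set.
def filter_results(results, asserted: true, inferred: true)
  results.select do |r|
    (asserted && r[:asserted]) || (inferred && r[:inferred])
  end
end
```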

When somebody tries to access the main collection, redirect them to the first window onto the collection.

Has Broader
[DONE] Create a set of "catalogue resources".
Other
What happens when somebody tries to directly access the main collection?

Where should Sense Atlas go?

Other
Get Sense Atlas/Intertwingler running the client extranets, which are shielded from the thrash of the open internet.
[DONE] Put Sense Atlas on its own website, senseatlas.net.
[DONE] Put Sense Atlas online so the client can observe its development.

While Intertwingler has been designed to drive multiple websites (authorities), it has only been tried with a single site, and there are invariably kinks in the implementation.

Other
For now, just configure Intertwingler to power this one client's extranet.
Get Sense Atlas/Intertwingler running the client extranets, which are shielded from the thrash of the open internet.
Intertwingler needs proper functioning multi-site support anyway.

X

The XSLT component kind of sucks.

Other
Design a compact XSLT syntax and concomitant transpiler.
Despite being standard, reliable, fast to execute, and built into every browser, XSLT 1.0 in particular is really clunky to work with.
Ship a front end with interfaces for IBIS, PM, and a rudimentary FOAF/ORG editor.
The way the template "inheritance" currently works is extremely messy.

XSLT (any version) would be a million times easier to work with if it was just easier to type.

Other
Design a compact XSLT syntax and concomitant transpiler.

Y

Yes. Supply an initialization parameter to ignore the whining.

Has Broader
Detect what version the metadata layout is and whine at the user to upgrade it on their own time.
Other
Should the user be able to silence the whining?