- A separate RDF store per site will make a big mess with Intertwingler's configuration data.
- A signed (and/or encrypted) JWT over an ordinary HTTP reverse proxy connection can perform the same function as REMOTE_USER within a FastCGI interface.
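The idea above can be sketched as Rack-style middleware. This is a minimal illustration, not Intertwingler's actual code: the `X-Forwarded-User` header name and the HS256 shared secret are assumptions, and a real deployment would also check the token's `exp` and issuer claims.

```ruby
require 'openssl'
require 'json'

# Hypothetical sketch: the reverse proxy forwards a compact HS256 JWT;
# the middleware verifies the signature and surfaces the subject claim
# as REMOTE_USER, the same way FastCGI would have.
module B64URL
  def self.encode(bytes)
    [bytes].pack('m0').tr('+/', '-_').delete('=')
  end

  def self.decode(str)
    s = str.tr('-_', '+/')
    s += '=' * (-s.length % 4)  # restore stripped padding
    s.unpack1('m0')
  end
end

class JWTRemoteUser
  def initialize(app, secret)
    @app, @secret = app, secret
  end

  def call(env)
    if token = env['HTTP_X_FORWARDED_USER']
      header, payload, sig = token.split('.')
      mac = OpenSSL::HMAC.digest('SHA256', @secret, [header, payload].join('.'))
      # only trust the subject claim if the signature checks out
      env['REMOTE_USER'] = JSON.parse(B64URL.decode(payload))['sub'] if
        sig == B64URL.encode(mac)
    end
    @app.call(env)
  end
end
```

An encrypted (JWE) variant would additionally keep the claims opaque to anything between the proxy and the app.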
- About half a dozen dependencies have patches that have yet to be merged upstream.
- Alias privatealpha.com to methodandstructure.com.
- An inventory that takes all these parameters does much more than just display resources in the range of a given property.
- Arrange for each site under management to have its own RDF store.
- Attach "next" and "previous" relations to the window resources, when applicable.
- Bust out the profiler and fix whatever's causing RDF::Graph to run (INSANELY) slow.
- Client is not a Ruby developer.
- Client wants to map IBIS issues to GitHub issues.
- Client wants to use Intertwingler as a back-end.
- Client wants to use the IBIS tool (Sense Atlas) in everyday operations.
- Create an "inventory" resource that is itself a list of resources with the characteristic of being in the range of a given property.
- Create an API bridge that will connect the two systems.
- Dedicate some time to hunting down and squashing memory leaks and general wasteful memory usage.
- Design a compact XSLT syntax and concomitant transpiler.
- Despite being standard, reliable, and fast to execute, and built into every browser, XSLT 1.0 in particular is really clunky to work with.
- Do we actually want to keep this domain alive?
- E-mail addresses are still going out in the clear.
- Embedding Forget Passwords into Intertwingler is going to severely complicate (and compromise) the configuration for both systems, as well as the installation as a whole.
- Extend Store::Digest to include a way to indicate that an entry ought to be treated as cache.
- File a bug upstream to get proper support for FastCGI in Rackup.
- Finish implementing addressable transforms.
- Fish the FastCGI driver out of old Rack code and install it under /var/lib/intertwingler.
- For now, just configure Intertwingler to power this one client's extranet.
- Get Intertwingler off deuce as soon as financially possible.
- Get Intertwingler the hell online.
- Get Sense Atlas/Intertwingler running the client extranets, which are shielded from the thrash of the open internet.
- Get in touch with the maintainers of the respective packages and get them to merge the patches.
- Get rudimentary multi-site support working on Intertwingler.
- Have the resolver wrap its repository in an RDF::Graph.
- Having to pass the graph name into every query is going to be tedious and error-prone.
- How about paying money for proper infrastructure?
- How are the parameters reconciled if you have multiples at once?
- How do we get instance data in and out?
- How do we handle authentication and access control?
- How do we keep drafts and other works in progress from being exposed to the public?
- How do you get the users into the system?
- How do you make sure that you net resources that are subclasses of, or related by subproperties to, the supplied parameters?
- How do you safely store API credentials?
- How does a client determine how to fill out the entire collection?
- How does the front-end know where the catalogues are?
- How is the inventory going to be related to the index?
- How will we construct the UI of the RDF-KV forms?
- I don't love having my dev server conscripted into production.
- I have fibre and get a 1.7 ms ping time to the server, even over IPsec.
- I want to use Intertwingler to run my Web properties.
- If an entity is inserted into the store overtop of a cache entry, clear the cache flag and ensure it is never re-enabled.
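The proposed flag semantics can be pinned down with a toy store. This is not Store::Digest's actual API, just an illustration of the rule: a durable insert overtop of a cache entry clears the flag, and the flag can never come back for that entry.

```ruby
# Toy content-addressable store illustrating the proposed cache-flag
# behaviour (names invented for illustration).
class FlaggedStore
  Entry = Struct.new(:content, :cache)

  def initialize
    @entries = {}
  end

  def insert(digest, content, cache: false)
    if old = @entries[digest]
      # once an entry has been stored as durable, it stays durable
      cache &&= old.cache
    end
    @entries[digest] = Entry.new(content, cache)
  end

  def cache?(digest)
    entry = @entries[digest]
    entry && entry.cache
  end
end
```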
- Implement a rudimentary mechanism for partitioning the graph into "public" and "private".
- Implement caching of generated representations against Store::Digest.
- Implement e-mail scrambler transform.
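One plausible shape for the scrambler (an assumption, not the committed design): rewrite every address in the markup as numeric character references, which render identically in a browser but defeat naive harvesters.

```ruby
# Minimal sketch of an e-mail scrambling transform: encode each
# character of the address as a hexadecimal character reference.
EMAIL = /[\w.+-]+@[\w-]+(?:\.[\w-]+)+/

def scramble_emails(html)
  html.gsub(EMAIL) do |addr|
    addr.each_char.map { |c| format('&#x%x;', c.ord) }.join
  end
end
```

A more thorough version would operate on the parsed DOM (so it can also catch `mailto:` hrefs) rather than on raw text.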
- Implement internal caching.
- Implement provisional JSON-LD variants for large resources (notably the catalogues) accessed through client-side scripting.
- Indiscriminately loading transform handlers is a much simpler design.
- Install Intertwingler on DigitalOcean VPS as a FastCGI daemon.
- Intertwingler currently lacks any internal caching, so I don't trust it standing up to even the baseline thrash of the open internet.
- Intertwingler has been observed consuming up to a gigabyte of resident-set RAM during development.
- Intertwingler is currently sharing the same RDF store for all sites under management.
- Intertwingler needs proper functioning multi-site support anyway.
- It turns out the server at DigitalOcean is actually really slow.
- It turns out that the default (AMD64) build is unusably slow in the client's development environment (Darwin/ARM64).
- It would be useful to be able to indicate to the engine that a given handler manages a content-addressable store (and thus can be used for caching).
- Leave Forget Passwords on the DO server, and hack both it and Intertwingler, respectively, to transmit and receive a JWT.
- Load all tfo:Function entities in the graph indiscriminately.
- Migrate App::IBIS content to Intertwingler/Sense Atlas.
- Modify the transform that inserts XSLT processing instructions to match against asserted types (or specific resources).
- Ordinary transforms are already attached to their respective queues. How do we handle transforms for addressable queues?
- Paginate the result set.
- Partitioning site-specific data by named graph also messes with the engine/resolver/handler/transform configuration.
- Plan projects and resources with the use of the Process Model Ontology.
- Properties go in the other direction too, so it would be useful to be able to pick subjects of properties as well as objects.
- Put doriantaylor.com into Intertwingler.
- Put every addressable transform in the queue and have the path parameters filter out and rearrange the remnants.
- Put intertwingler.net into Intertwingler (i.e., make it run its own website).
- Put makethingsmakesense.com into Intertwingler.
- Put methodandstructure.com into Intertwingler.
- Put natureof.software into Intertwingler.
- Put site-specific data in a named graph that corresponds to the authority, then narrow queries to that named graph.
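The partitioning idea reduces to quads instead of triples: every statement carries the graph name of its site's authority, and queries are narrowed to that graph. A toy illustration (the data and authority URLs are made up):

```ruby
# Each statement is tagged with the named graph of its authority;
# narrowing a query to one graph keeps sites from seeing each other.
Quad = Struct.new(:subject, :predicate, :object, :graph)

class QuadStore
  def initialize(quads)
    @quads = quads
  end

  def query(graph:, predicate: nil)
    @quads.select do |q|
      q.graph == graph && (predicate.nil? || q.predicate == predicate)
    end
  end
end
```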
- Put vocab.methodandstructure.com into Intertwingler.
- RDF::Graph slows down operations by two orders of magnitude!
- Represent windowed slices of a collection as ephemeral entities. Have them refer back to the main collection that contains the unmanageably many elements.
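Windowing, the back-reference to the main collection, and the "next"/"previous" relations can all be sketched in a few lines. The identifier scheme (`window-1`, `main-collection`) is invented for illustration:

```ruby
# Slice an oversized collection into ephemeral windows, each linked to
# its neighbours and back to the collection it is a window onto.
def windows(members, size)
  slices = members.each_slice(size).to_a
  slices.each_index.map do |i|
    { id:      "window-#{i + 1}",
      of:      'main-collection',
      members: slices[i],
      prev:    i.zero?              ? nil : "window-#{i}",
      next:    i == slices.size - 1 ? nil : "window-#{i + 2}" }
  end
end
```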
- Rough out the new classes and try loading them with instance data; take notes for what can be improved.
- Separate RDF stores are the second most reliable way to keep data from leaking across sites under management, second to separate Intertwingler instances altogether.
- Separate stores will consume considerably more RAM and drive space.
- Ship a front end with interfaces for IBIS, PM, and a rudimentary FOAF/ORG editor.
- Shipping Intertwingler as a Docker image will afford putting off the open-source diplomacy of the upstream patches.
- Split Forget Passwords into app and middleware and wrap the latter around Intertwingler (on deuce).
- The FastCGI daemon running Intertwingler keeps exiting; there is something not right with it.
- The Rack maintainers have eliminated the FastCGI driver from Rack ≥3.0 (and thus Rackup) for being "antiquated".
- The XSLT component kind of sucks.
- The address space is already occupied by the old statically-generated sites and the old App::IBIS prototype.
- The original inventory resource in the prototype selects by instance of a class rather than by the range of a property.
- The plan was always to turn Forget Passwords into proper Rack middleware.
- The way the extranets are configured, it would be a big pain to carve out an exception for just one of them.
- The way the template "inheritance" currently works is extremely messy.
- There already exists considerable infrastructure for selectively loading transform handlers.
- There are a number of scaled and transformed images on the site.
- There are all manner of properties, so hard-coding a single one is silly.
- There are almost certainly memory leaks in and around Intertwingler and its snarl of dependencies.
- There is a concern about the memory consumption of unused handlers.
- There is still value in getting this working for Forget Passwords, which is stuck on Rack 2.x.
- This strategy may prevent us from using named graphs for something different later on.
- To populate potential linkages, the tool needs a list of all resources that are in the range of a given property.
- Try messing around with SaxonJS?
- Use the import precedence native to XSLT.
- We can shell out for proper infrastructure when this thing starts making some money.
- We have a good relationship with the Ruby-RDF maintainer, who very promptly merges and pushes out our patches.
- We will need to be able to handle a much higher request rate if we expose Intertwingler to the open internet.
- We would actually be able to represent real sequences of actions instead of just suggestions about what to do.
- We would actually be able to unblock development on the IBIS tool that has been sitting idle for over a decade.
- What about changes that happen on GitHub?
- What about writing to the graph?
- What happens when somebody tries to directly access the main collection?
- What happens when you have an extreme quantity of entities for a given class and/or in the domain or range of a given predicate?
- What if a cache entry is superseded by an entity that is durable?
- What if the client has trouble?
- What if the result set is enormous?
- What if we only want resources that match assertions with no inferencing?
- What if you wanted only inferred results with no asserted ones?
- When somebody tries to access the main collection, redirect them to the first window onto the collection.
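In HTTP terms that redirect is a 303 See Other. A minimal Rack-style sketch, assuming a made-up URL scheme for the collection and its first window:

```ruby
# Direct hits on the main collection get bounced to the first window;
# window URLs themselves are served normally. Paths are illustrative.
def collection_response(path)
  if path == '/main-collection'
    [303, { 'location' => '/main-collection/window-1' }, []]
  else
    [200, { 'content-type' => 'text/html' }, ['...window contents...']]
  end
end
```

303 (rather than 302) makes it explicit that the window is a different resource that merely describes the collection.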
- Where should Sense Atlas go?
- While Intertwingler has been designed to drive multiple websites (authorities), it has only been tried with a single site, and there are invariably kinks in the implementation.
- XSLT (any version) would be a million times easier to work with if it were just easier to type.
- [ADOPTED] Create a generic inventory resource that can take any combination of filtering parameters.
- [ADOPTED] Treat the `instance-of`, `in-range-of`, `in-domain-of` parameters as a union, that is, select all resources that match any of them.
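The adopted union semantics: a resource is selected if it matches *any* of the supplied parameters. A toy version over invented triples (the vocabulary here is not Intertwingler's):

```ruby
# A resource is in the inventory if it is an instance of any listed
# class, OR the subject of any listed property, OR the object of one.
TRIPLES = [
  [:a, :type,   :Issue],
  [:b, :author, :c],      # :b is in the domain of :author
  [:c, :type,   :Person]  # :c is also in the range of :author
].freeze

def inventory(instance_of: [], in_domain_of: [], in_range_of: [])
  TRIPLES.flat_map do |s, p, o|
    hits = []
    hits << s if p == :type && instance_of.include?(o)
    hits << s if in_domain_of.include?(p)
    hits << o if in_range_of.include?(p)
    hits
  end.uniq
end
```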
- [DONE] Add an `instance-of` parameter that takes class identifiers for parity with the original resource.
- [DONE] Clean up slow RDF::Graph methods and push upstream.
- [DONE] Create a Git repository with a prefabricated configuration.
- [DONE] Create a set of "catalogue resources".
- [DONE] Create an "index" resource that enumerates all the windowed resources.
- [DONE] Create an RDF-KV handler.
- [DONE] Create an `asserted` flag that is on by default but can be set to false to eliminate asserted resources from the result set.
- [DONE] Create an `inferred` flag that is on by default but can be set to false that will disable inferencing.
- [DONE] Create an index as a top-level entry point to the catalogues.
- [DONE] Decouple the IBIS front-end from its back-end (into its own Git[hub] repository).
- [DONE] Define a relation that connects the tool to the entry point.
- [DONE] Generate an ARM64 Docker image.
- [DONE] Install Intertwingler on deuce (which is ostensibly a much, much faster machine than that piddly little VPS on DigitalOcean) and just reverse-proxy it over the VPN.
- [DONE] Make the Intertwingler home directory a Docker volume.
- [DONE] Make the inventory resource respond to an `in-domain-of` parameter for symmetry with `in-range-of`.
- [DONE] Perform RDFS/OWL inferencing against the supplied parameters and result set.
- [DONE] Port the IBIS tool to Intertwingler.
- [DONE] Put Sense Atlas on its own website, senseatlas.net.
- [DONE] Put Sense Atlas online so the client can observe its development.
- [DONE] Put client extranets into Intertwingler.
- [DONE] Turn the `in-range-of` property into a parameter.
- [DONE] Ship Intertwingler as a Docker image.
- [DONE] Ship an Intertwingler instance that the client can use.
- [DONE] Upgrade DigitalOcean server to handle the additional load introduced by Intertwingler.
- [DONE] Write a getting-started tutorial.
- [RESOLVED] As it stands, every site managed by Intertwingler can see the data from every other site.
- [RESOLVED] Intertwingler currently does not have the resources to furnish the IBIS tool with fodder for autocompletes.
- [RESOLVED] The front-end code to the IBIS tool is currently in the back-end repository.
- `RDF::Vocabulary.find` is a major culprit. It is used in `Intertwingler::Resolver#coerce_resource` and does a sequential scan over hundreds of vocabularies every time a lookup is done.
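The obvious fix is to memoize the lookup so each URI pays for the sequential scan at most once. A sketch with invented names (it mimics, but is not, RDF::Vocabulary's real internals):

```ruby
# Memoized front for an expensive sequential vocabulary scan: the
# first lookup of a URI walks the list; repeats hit the hash.
class VocabCache
  def initialize(vocabs)
    @vocabs = vocabs   # e.g. hundreds of [prefix, vocab] pairs
    @cache  = {}
  end

  def find(uri)
    @cache.fetch(uri) do
      @cache[uri] = @vocabs.find { |prefix, _| uri.start_with?(prefix) }
    end
  end
end
```

Cache invalidation is trivial here because the vocabulary list is effectively static for the life of the process.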
- doriantaylor.com only needs a basic notion of "private" (only visible to me) and "public".