Some words on the entrenched anti-leeching culture of the Web:

"Hot-linking" – now called Cross-Origin Resource Sharing (CORS) – is the practice of using resources such as images, fonts, stylesheets, scripts – even entire Web pages/apps that are hosted on one server in Web pages that are hosted on another server.

The Web has historically been antagonistic to hot-linking. This made a lot of sense in the 1990s, when a personal Web site might have an austere bandwidth cap of, say, 10MB per month. You'd create a cool Web page with some low-res GIFs, and your total page load might amount to 100KB – which meant that 100 people could visit your page in a month before you'd use up your bandwidth quota. That was all well and good – until some unscrupulous "leeches" hot-linked your images into their Web pages. Then, you'd be blowing your bandwidth quota in a week, serving images to those OTHER people's pages, and your site would be unavailable for three weeks out of the month! Rude!

Now, some people may not have noticed, but it is not the 1990s anymore.

The "anti-leeching" stance is no longer well-supported by reason of resource limitations. Unless, of course, you're hosting many gigabytes of 4K video from your site – but I don't imagine you'd be serving that from a low-cost shared hosting account, or a free one. And yet, the attitude persists. People don't like their bandwidth being leeched.

The xanalogical model is built on leeching.

Hot-linking by another name is transclusion. Xanalogical documents are transclusion-based documents. A xanalogical server must be ready and willing to serve media resources to anyone who requests them, regardless of which document they're being requested for, and wherever that document may be hosted.
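
To make that posture concrete, here's a minimal sketch, assuming a plain Node HTTP server (the file layout and port are arbitrary), of the "serve to anyone" stance a xanalogical host would take – the opposite of hot-link protection:

```typescript
import * as http from "node:http";
import * as fs from "node:fs";

// Deliberately permissive media server: any document, hosted anywhere, may
// transclude resources from here. A sketch only – no path sanitization,
// content-type detection, or origination metadata, all of which a real
// xanalogical server would need.
const server = http.createServer((req, res) => {
  res.setHeader("Access-Control-Allow-Origin", "*"); // serve to any origin
  const filePath = `./media${req.url ?? "/"}`;
  if (!fs.existsSync(filePath)) {
    res.writeHead(404);
    res.end("Not found");
    return;
  }
  res.writeHead(200);
  fs.createReadStream(filePath).pipe(res);
});

server.listen(8080);
```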

Now, the "anti-leeching" rationale behind CORS restrictions is not the only one. In fact, it's not REALLY the primary rationale.

The real reason people don't like to share resources across domains is that authors and rights-holders need you to view their material on their site, and only their site, because that's the only way they can secure any revenue from it.

If they generate revenue from ad impressions, they need you to be on their Web site to get those ad impressions.

If they generate revenue from product sales, they need you to be able to see their promotions and to visit their e-shop while viewing their material.

If they generate revenue from some kind of pay-for-access system (paywalling), then they need access to their work to run exclusively through their site.

There is tremendous value to be derived from exclusivity, and people will go to great lengths to protect that value.

The perceived cons of leeching need to be weighed against the benefits of transclusion. Transclusion facilitates DISCOVERY, which has a cultural value that, from my perspective, outweighs both the challenges it poses to monetization of content and the potential for resource overuse, which would mean higher hosting costs (or availability outages) for small-scale creators. However, the current state of discoverability on the Web is ...sub-optimal, which diminishes its value and makes it harder to make the case for universal transclusion.

As much as I wish I could, I can't just wave my hands and say, hey, anti-leeching sentiment is stupid, let's forget it! I, too, am a person living in the real world, with bills to pay and children to feed, so I recognize that controlling one's intellectual property and charging fees for its use is not just "good business" but NECESSARY business.

Solutions?

Project Xanadu has historically promoted micropayments as an answer to the question of "how do authors get paid?" and transcopyright as the answer to "how is licensing going to work?". Both are worth understanding, at least at a superficial level, so I encourage anyone interested to start with the explainer pages on this site.

And what solutions does the Alph project offer? None, folks! But I can give you my perspective on this issue as a freelance illustrator.

The bulk of my income comes from new commissions, not licensing. So I view the propagation of my artwork online as promotion, and as having a net-positive effect on my business – as long as people can find out who the illustrator is! For me, the value in discovery outweighs the potential value of licensing revenue for re-use of my artwork online. I sometimes also make money from selling print editions of my illustrations, and in order for people to find their way to my e-shop, I need them to find their way to my Website/social media – again, I need them to be able to discover the work's source.

In my not-so-humble opinion, the people who are most proactive about restricting access to copyrighted material are RIGHTS-HOLDERS, not necessarily authors. For people whose business is not centered on the production of creative works, but rather the exploitation of the copyrights of those works, well, it stands to reason that they would lobby against a system that allowed anyone, anywhere, to re-publish the material that they must restrict access to in order to exist.

And what about the "bandwidth theft" issue for someone like me?

I pay a modest annual fee for Web hosting. I have a bandwidth allowance of about 75GB per month that I don't get anywhere near utilizing fully. But maybe that's just because so much of my work is hosted on other servers. All of my social media is hosted by others, and the commissioned work that I do lives on the servers/CDNs of the commissioning publication.

So let's imagine a plausible real-world scenario in a xanalogical future Web: I've been hired to do an illustration for an entertainment magazine discussing the production setbacks of a highly anticipated upcoming film. Because this is a xanalogical world, when people visit the magazine's Website to read the article, the illustration is being transcluded from MY server. Let's assume the worst-case scenario: it's a full-pager, and the magazine requires that I deliver it at print resolution. We'll assume this is a complex composition with lots of detailed colour work, and so we've got a JPEG image that clocks in at, say, 5MB. This is a big magazine, and there's a lot of interest in the film, so the article gets ...a million hits. Is my little shared hosting plan going to allow me to transmit 5 terabytes of data to satisfy all of these transclusions? Can my server even handle all of those requests?

No. Not a little shared hosting account like I currently use.
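
For what it's worth, the back-of-envelope arithmetic (in decimal units; the file size and hit count are the hypothetical figures from the scenario above):

```typescript
// Rough check of the scenario above, in decimal units.
const fileSizeMB = 5;         // hypothetical print-resolution JPEG
const hits = 1_000_000;       // hypothetical article views, each transcluding the image
const totalTB = (fileSizeMB * hits) / 1_000_000; // 5,000,000 MB = 5 TB transferred
const quotaTB = 75 / 1_000;   // my ~75 GB/month shared-hosting allowance = 0.075 TB
console.log(`needed: ${totalTB} TB, available: ${quotaTB} TB per month`);
```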

But, if this were the publishing model I had to operate under, I'm sure my setup would be different. I might be paying for a dedicated hosting plan with a bandwidth allowance of 10TB or more, and I'd be billing my clients appropriately to cover my $200+/mo. hosting costs.

I believe, however, that there are lower-cost technical solutions to this. Xanalogical content delivery networks could absolutely be a thing. They would transmit the necessary origination and metadata information when satisfying requests, so that discovering a media resource's author, original context, etc. remains easy and universal.
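
Purely as a sketch of what I mean – none of these header names exist today, they're placeholders invented for illustration – the metadata such a CDN might attach to every cached response could look something like this:

```typescript
// Hypothetical only: illustrative header names for a "xanalogical CDN"
// edge response, so origination metadata travels with the cached bytes.
interface XanalogicalEdgeHeaders {
  "Access-Control-Allow-Origin": "*"; // transclusion allowed from anywhere
  "X-Origin-Server": string;          // where the authoritative copy lives
  "X-Origin-Document": string;        // the resource's original context
  "X-Author": string;                 // attribution, for discovery
}

// What a cache hit for a transcluded illustration might carry:
const exampleHeaders: XanalogicalEdgeHeaders = {
  "Access-Control-Allow-Origin": "*",
  "X-Origin-Server": "https://illustrator.example",
  "X-Origin-Document": "https://illustrator.example/portfolio/film-article-illustration",
  "X-Author": "The illustrator's name",
};
```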

With Alph, in its current implementation, I'm approaching the leeching issue from a "small web", self-publishing point of view. The bandwidth that I and others have access to is ample for this application, and the motives for creation in this scene are not chiefly commercial. Yes, this sidesteps the issue by saying "I'm not here to make money, I'm here to make art," and I can understand how that's not a satisfactory stance for many people – but it's my stance.

I think the best way to wrap this up is by quoting something that I wrote in my journal back when I started this project in 2016:

Commerce is inevitable, culture is incidental. We don't have to fight for commerce, we have to fight for culture — unless, that is, we're happy with a culture of commerce.