There are free demos available on the store page for Windows, Linux, and Nintendo DS, so Cobalt can still be used by people who can't afford $5. Charging a higher price would make the program a lot less accessible, and charging nothing would be unsustainable; I can't live off attention alone.
The second revision of the Bedrock specification changed very little of consequence, but I should probably add a change note to the spec. The structure of the document was reworked, becoming a single long document instead of multiple smaller ones; some sections were reworded for clarity; the behaviour in some underspecified edge cases was made explicit; the assembler specification was cleaned up; reading from the action port on the file device now returns error state instead of success state. I don't expect that any existing programs other than Cobalt will have been affected. The bedrock-nds emulator uses the second screen to run a second Bedrock program (the on-screen keyboard); it isn't exposed to the main program.
My intention for Bedrock is that it will never change going forwards, other than for minor clarifications, so that existing programs will continue to work on any emulator indefinitely. Have you played around with Bedrock before, or written a program for it?
I haven't written any Bedrock assembly yet. But a few months ago I read the spec and bookmarked it as something interesting. I am toying with the idea of creating a disassembler for it now.
Oh cool, that'd be great! I'm wanting to write a disassembler in the future as part of a dedicated debugger for Bedrock programs. If you end up writing down any plans I'd love to take a look or help you out, my email address is ben@derelict.engineering.
Thank you! The customizable tools give a surprising amount of power to the user for a relatively small amount of work (just a couple of basic editing screens). The most interesting outcome of all this is that the scatter and spacing parameters work just as well with the bucket fill tool as they do with regular brushes, letting you emulate white noise and similar textures when filling large areas.
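To make the idea concrete, here's a guess at how a scatter parameter could apply to a fill (this is purely illustrative, not Cobalt's actual code): paint each pixel the fill would cover only with some probability, which approximates white noise over a large area.

```python
import random

def scattered_fill(pixels, colour, scatter=0.5, seed=0):
    """Hypothetical sketch: pixels is an iterable of (x, y) positions the
    flood fill would normally cover; each one is painted only with
    probability `scatter`."""
    rng = random.Random(seed)
    return {p: colour for p in pixels if rng.random() < scatter}
```

With scatter at 1.0 this degenerates to an ordinary fill, and at 0.0 it paints nothing.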
The sketch layer is accessed with the 'eye' button in the bottom-right corner of the canvas screen. Clicking that button toggles visibility of the sketch layer and reveals three more buttons, and clicking the newly-revealed pencil button toggles drawing to the sketch layer instead of the canvas.
The decision to implement only two layers for Cobalt was a conscious one. The design of Cobalt is focused on speeding up the user and helping them to finish their images, and I found that being able to go back and tweak each layer made it more difficult to commit to a final image.
Thank you! I'm really passionate about exploring this direction of computing, digging around in a bargain bin of discarded futures to find ideas worth pursuing.
This would be fascinating to see, I have no idea how you'd even start.
There was a video I saw a couple of years back showcasing a cellular programming model, where each cell in a two-dimensional grid performed an operation on values received from its neighbours. Values would move into one side of a cell and out the other every tick, something like Orca (by 100 rabbits), so the whole thing could be parallelised at the cell level very easily.
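The model above can be sketched in a few lines (all names here are hypothetical, not from the video): each cell holds an operation, and on every tick it consumes the value arriving from one neighbour and emits a result to the next. Cells carry no other state, so every cell could tick in parallel.

```python
import operator

def tick(row, inputs):
    """Advance a 1-D row of cells one step.
    row: list of (op, operand) cells; inputs: values entering each cell."""
    outputs = [op(value, operand) for (op, operand), value in zip(row, inputs)]
    # Each output becomes the input of the cell to its right on the next
    # tick; a 0 flows into the leftmost cell, and the rightmost output
    # leaves the row.
    return [0] + outputs[:-1], outputs[-1]

row = [(operator.add, 3), (operator.mul, 2), (operator.sub, 1)]
inputs = [5, 0, 0]
for _ in range(3):              # three ticks carry the 5 across the row
    inputs, emitted = tick(row, inputs)
# emitted is now ((5 + 3) * 2) - 1 = 15
```

A real 2-D version would route values in four directions, but the parallelism argument is the same: each cell reads only last tick's neighbour outputs.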
I had a hard time figuring out whether Bedrock counted as an 8-bit or 16-bit computer, because it doesn't line up with the usual criteria as cleanly as a physical CPU does. I decided that the 8-bit label fitted best because it has an 8-bit data path, an 8-bit instruction format, and the stacks hold only bytes. It also has a 16-bit address space and can perform 16-bit arithmetic, but so can the well-known 8-bit Z80 processor.
The usual meaning of "data path" https://en.wikipedia.org/wiki/Datapath is the path from the register file (and/or memory access unit) to the ALU and back to the register file (and/or memory access unit). So we could say that both the 8088 and the 68000 had a 16-bit data path, because they used 16-bit buses for those transfers and a 16-bit ALU, even though the 68000 had 32-bit registers and the 8088 had an 8-bit data bus to connect it to RAM. The 68020 implemented the same instructions and registers as the 68000 (and additional ones) but used a 32-bit data path, so it was twice as fast.
In what sense does a virtual machine instruction set architecture with no hardware implementation have a "data path" separate from its arithmetic size? You seem to be using the term in a nonstandard way, which is fine, but I cannot guess what it is.
By your other criteria, the (uncontroversially "16-bit") 8088 would be an 8-bit computer, except that it had a 20-bit address space.
By data path, I mean the width of the values that are read from the stacks, program memory, and device bus. Pairs of 8-bit values can be treated as 16-bit values in order to perform wider arithmetic, but all data ultimately moves around the system as 8-bit values.
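That pairing can be shown with a few lines of Python (illustrative only; the stack order here, high byte pushed before low byte, is an assumption matching the big-endian doubles elsewhere in the spec, so check the spec for the actual convention):

```python
# Two 8-bit values on a byte stack, treated as one 16-bit value.
stack = [0x12, 0x34]        # high byte pushed first, then low byte
lo = stack.pop()            # 0x34
hi = stack.pop()            # 0x12
double = (hi << 8) | lo     # 0x1234 - 16-bit arithmetic can happen here...
result = (double + 1) & 0xFFFF
stack.append(result >> 8)   # ...and the result moves back as two bytes
stack.append(result & 0xFF)
```

The 16-bit value only ever exists inside the operation; everything entering or leaving the stack is a single byte.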
Whether 16-bit data moves around the system as 8-bit values or not sounds like a question about the implementation, not the architecture (the spec).
For example, the spec says, "Reading a double from program memory will read the high byte of the double from the given address and the low byte from the following address." But I'd think that generally you'd want the implementation to read the whole 16-bit word at once and then byte-swap it if necessary, because that would usually be faster. There's no way for the program to tell the difference, unless reading the first byte has a side effect that changes the contents of the second byte, or otherwise depends on whether you were reading the second byte at the same time.
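The two strategies are observably equivalent; a quick sketch (the function names are mine, and `mem` is just a bytes-like stand-in for program memory):

```python
import struct

def read_double_bytewise(mem, a):
    """Literal reading of the spec: high byte at a, low byte at a + 1."""
    return (mem[a] << 8) | mem[a + 1]

def read_double_word(mem, a):
    """Read both bytes in one go and let struct apply the big-endian
    byte order; a real implementation would fetch an aligned machine
    word and byte-swap when the host is little-endian."""
    return struct.unpack_from(">H", mem, a)[0]
```

Both return the same value for every address, which is exactly why the program can't distinguish the implementations.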
(Of course if you have a "double" that crosses the boundaries of your memory subsystem's word, you have to fall back to two successive word reads, but that happens transparently on amd64 CPUs.)