There is no centralised part of Git. He's trying to decentralise GitHub, which is a centralised website with many features (issues, projects, etc.) that also acts as a hosted Git remote.
The problem with Git decentralization is that you need a publicly accessible endpoint (meaning there is an eventual central host). Even when you are using email for patch files, your email becomes the central point of failure (even if you host it yourself). As far as fetching goes, this is why git.kernel.org exists.
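For context, the email-based workflow I mean looks roughly like this (the list address and patch filename are placeholders, not anything from the article):

```
# Sender: turn local commits into mailable patch files and send them.
git format-patch origin/master                 # one .patch file per commit
git send-email --to=dev@example.org *.patch    # dev@example.org is a stand-in

# Receiver: apply a patch taken from the mailing list.
git am 0001-some-change.patch                  # filename is hypothetical
```

Whoever runs that list (or your own mail server) is still the endpoint everyone has to reach.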
Git behaves like an mp3-sharing website with multiple mirrors, whereas gittorrent behaves like magnet URLs. This protocol effectively obsoletes the need for remotes.
Not sure what you mean by this. The article mentions and compares itself to Github throughout.
> The problem with Git decentralization is that you need a publicly accessible endpoint (meaning there is an eventual central host).
I disagree slightly here. It's completely possible to build a network of remotes that don't all reference a single central origin (e.g. teammates referencing each other's local repos over authenticated connections, possibly over a LAN, etc.). This gets messy and is hard to administer securely, but Git is more than capable of it. Also, the challenges of doing this securely in Gittorrent with private repos seem similar to those of using SSH remotes between individual dev machines.
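A minimal sketch of what I mean, assuming a teammate's machine is reachable over SSH (the name, host and path here are made up):

```
# "alice" and her host/path are hypothetical; any SSH-reachable box works.
git remote add alice ssh://alice@alice-laptop.local/home/alice/project.git
git fetch alice
git merge alice/feature-x      # or review/cherry-pick as usual
```

Every peer can add every other peer the same way; nothing in Git itself requires one of those remotes to be blessed as "the" origin.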
As for the article talking about "hubs" (Github specifically): Gittorrent purports to replace the need for Github, but it only fills one of the uses of a hub. A hub like Github serves two purposes:
1. a central origin, i.e. the same use git.kernel.org serves
2. a searchable/discoverable network, e.g. the same use the central thepiratebay.org search engine serves for torrents. Decentralising this seems like a hard problem.
As someone who lives on Linux and has to touch OSX for things like building for iOS devices, I find it really odd that OSX doesn't just solve these problems like all the FOSS distros do. It's insanely odd that Apple gives me a bash shell which is years out of date, and it feels hacky to just install what I want. I know proprietary and free software kinda have a hard time co-existing on operating systems built on free or mostly free software, but if OSX is half free software and it's POSIX compliant, it's hard to believe that Apple couldn't give you both proprietary software and an easy package system for developers.
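The usual workaround for the stale bash, assuming Homebrew and the default /usr/local prefix (yours may differ):

```
brew install bash                                    # current bash from Homebrew
echo /usr/local/bin/bash | sudo tee -a /etc/shells   # allow it as a login shell
chsh -s /usr/local/bin/bash                          # switch your account to it
```

It works, but it's exactly the kind of "just use what I want" hack I'd rather not need.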
The sad thing about Linux is that, as much as I love it and its ecosystem, I can't recommend it to anyone who wants things to "just work":
- X and Wayland crash on me all the time on this laptop because of its HiDPI screen and my kinda-works-but-is-wonky fixes for multiple monitors.
- Hardware support is the best it's ever been, but graphics cards, wifi, exotic devices, laptop power states and embedded devices can still be a pain because manufacturers simply don't care.
- Desktop applications can still be a little glitchy. Web browsers work fine, as do first-party DE apps, but the further you get from the big-name GUI toolkits into custom controls and behaviour, the more problems you seem to run into.
But aside from all that, I'm happy here on Ubuntu. When your software library feels as easy as picking a book from a shelf, and 90% of the system updates through the update manager, which says "hey, restart when you feel like it", I'm quite comfortable.
Apple wants to use open source code, but doesn't want to make their proprietary code open or to license patents to anyone who wants to build off of their code.
Ultimately Apple will sell developer devices at the price point of their Mac Pro line, bundled with all the approved tech you are allowed to use to build end-user services/apps. In addition to the high purchase price you will have to sign up for a developer account and pay annually.
Everyone else will buy consumer-oriented devices that are as open as the iPad.
It's not a bicycle for the mind, it's a train, and if you dress appropriately and pay your fee you may set up a concession stand along the route.
Today's equivalent would be those Funko Pop Vinyl figures. I'm amazed at how many people think they are worth something when there are hundreds of rudimentary-looking figures for every cult-favorite movie, videogame, and cartoon.
The hype around rapidly increasing values on Pop Vinyls has largely gone; the market seems to have cooled, and the reason it hasn't crashed is likely the value attached to the licenses.
The most interesting part of the whole thing, for me, was the effect valuation sites (e.g. poppriceguide) had on the market. They significantly lowered the effort of determining value for existing collectors, and attracted a lot of people who wouldn't normally collect to get into it purely for the profit to be gained. I am not suggesting others wouldn't have been attracted to the market without them, but having the data readily available, rather than having to trawl eBay completed sales, lowered the barrier to entry.
That's a terrible restriction for people who have many virtual desktops. I've used 10 in two rows for years (or 9 in xmonad) so that's my mental model of where everything is located.
Not surprised to see a modem/router from Australia's biggest Telco on the list.
Australia's internet is in such a confusing state. At the moment a house could potentially get internet via any one of ADSL1/2/2+, VDSL2+, HFC, fiber, wireless or satellite, and on top of that most telcos are shipping woefully bad modem/router hardware with equally bad, gimped, and under-supported firmware to consumers.
Sounds like Norway, basically. Thankfully the local powerco decided to roll out fiber in my area recently, and I think some 99% of the households jumped on board.
Before then it was all about ADSL2+ (or some such) over an aging copper pair.
I definitely don't think you are wrong, but for practical purposes "curated" doesn't mean what it used to, and I think we can all suffer less when we accept that. It's probably one of the prices society pays for the internet, and it doesn't have to be a bad thing.
Did you try a later kernel? Out of the box, Ubuntu 16.04 is on the long-term 4.4 kernel. It looks like some Ryzen features and patches were added in 4.10, and they probably were not back-ported to the long-term kernels.
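If I have the package name right, the HWE (hardware enablement) stack is the easiest way to get a newer kernel onto 16.04:

```
uname -r                                  # confirm you're still on 4.4.x
sudo apt-get install --install-recommends linux-generic-hwe-16.04
sudo reboot                               # boot into the newer HWE kernel
```

Otherwise the mainline builds from kernel.ubuntu.com are an option, though they're unsupported.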
Not OP, but I just built a Ryzen 1600 box. I've had _more_ instability when running 17.04 and settled on 16.04, which has been mostly fine, but it still hard-crashes occasionally.
I custom-built a machine last week for the first time in over a decade, with a fairly new processor and a just-released video card. If I had more confidence in my PC building I guess I'd be more upset, but there are a lot of variables here and I'm still in the honeymoon phase.
I installed the official AMD RX580 drivers and it's been stable since.
Nothing out of the ordinary for a new platform. While Linux often takes longer to work this stuff out, Windows has often had similar issues in the past as well.
Having experienced AMD driver instability first-hand on both Windows & Linux, AMD have lost my custom for the next 10 years.
Multiple recurring driver crashes using their main graphics card product line (RX380) on Windows 10.
So, pretty mainstream hardware, and yet the driver crashes (even when doing non-intensive tasks, e.g. web browsing).
For the record, I'm not so sure Nvidia is any more stable either.
The only consistently stable graphics provider over the years has been Intel's on-board graphics.
I'm hoping that they do bring it back, because Nintendo did see the potential and has dev tools for the Wii U to target HTML-based content: https://developer.nintendo.com/tools
If you use the Minimal CD for installing Ubuntu (https://help.ubuntu.com/community/Installation/MinimalCD) you can pick a metapackage along the way that may be more to your liking. I don't know what exactly you don't want from a standard Xubuntu install, but there is an Xfce "minimal" metapackage in the ecosystem which may help you install only what you need from the start; see the sketch below.
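If I remember the metapackage name right, the lighter set is xubuntu-core, so something like this after the minimal CD's base install:

```
sudo apt-get update
sudo apt-get install --no-install-recommends xubuntu-core   # Xfce desktop without the full Xubuntu app set
```

Dropping --no-install-recommends pulls in a bit more, so adjust to taste.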