Don't be surprised when the answer is "not much". Apply supply and demand to electric power generation: if your grid rate is getting hiked, then so is the market price of used solar.
Texas State Bar is still a thing. This means that it has split from the American Bar Association, but the legal system of Texas is still part of the US legal system.
> Texas State Bar is still a thing. This means that it has split from the American Bar Association, but the legal system of Texas is still part of the US legal system.
Lawyer here, member of Texas and California bars. There seems to be a misunderstanding here:
1. A state bar is what a lawyer has to belong to in order to practice regularly in that state (with some exceptions, e.g., for federal-court practice). Example: To practice regularly in California, a lawyer must be a member of the State Bar of California. That normally requires passing a bar exam or (in some states, if you're an experienced lawyer) getting in by "reciprocity."
AFAIK, every state bar is separately regulated by the highest court of the state (and, sometimes, by state statute). Example: The State Bar of Texas is subject to regulations promulgated by the Supreme Court of Texas.
2. In contrast, the ABA is a purely voluntary private association of lawyers. A lawyer doesn't have to belong to the ABA in order to be a lawyer or practice law.
3. IIRC, the ABA's governing body includes liaisons from state bars. But AFAIK, there's never been any official governing connection between the ABA and any state bar.
4. The ABA's law-school accreditation standards [0] are a way for states to adopt uniform standards, thus avoiding the cost of developing individual standards (and of complying with a variety of standards). Those ABA standards are roughly analogous to national model building codes for plumbing, etc. — they're adopted by various jurisdictions but have little or no legal standing in any given jurisdiction unless adopted.
Do you think malware creators find out by reading HN or GitHub? I don't understand the vitriol; the request "GitHub should take a harder stance" could have a chilling effect on security researchers, pushing high-impact exploits deeper underground.
There isn't vitriol, or at least I didn't mean it that way. The point I was trying to make is that I've seen malicious code like viruses, keyloggers, and rootkits being distributed via GitHub, and they use 'this is for education' as a cop-out when the rest of the repo makes the real intention extremely obvious.
Malware is very easy to build. Competent threat actors don't need to rely on open source software, and incompetent ones can buy what they use from malware authors who sell their stuff in various forums. Concerns similar to yours about 'upgrading' the capabilities of threat actors were raised when the NSA made Ghidra public, yet the NSA considers the move itself to have been good (https://www.nsa.gov/Press-Room/News-Highlights/Article/Artic...).
People will build malware. It is actually both fun and educational. Them sharing it makes the world aware of it, and when people are aware of it, they tend to adjust their security posture for the better if they feel threatened by it. Good cybersecurity research & development raises the bar for the industry and makes the world more secure.
Have you ever heard the phrase:
"To stop a hacker you have to think like a hacker."
That's cybersecurity 101. Without the hacker's knowledge or programs, you're just a victim or target. But with this knowledge made available, now you are aware of the program/possibility. It's like when companies deploy honeypot servers to capture the methods and use cases of hackers attacking the server, to build stronger security against their methods and techniques.
If you want to see a comparison against an even broader set of open-source compression algos, there's lzbench (it's linked directly from the ZXC GitHub page).
lzbench has added ZXC to its suite, which makes a nice apples-to-apples comparison possible.
"The QWERTY layout became popular with the success of the Remington No. 2 of 1878...
"The 0 key was added and standardized in its modern position early in the history of the typewriter, but the 1 and exclamation point were left off some typewriter keyboards into the 1970s."
There are always a few oddball variations. But desk work will probably still use a QWERTY keyboard in the year 2100.
A college-level approach could look at the line between Math/Science/Physics and Philosophy. One thing from the article that stood out to me was that the introduction to their approach started with a problem about classifying a traffic light: is it red or green?
But the accompanying XY plot showed samples that overlapped or at least were ambiguous. I immediately lost a lot of my interest in their approach, because traffic lights by design are very clearly red or green. There aren't mauve or taupe lights that the local populace laughs at and says, "yes, that's mostly red."
I like the idea of studying math by using ML examples. I'm guessing this is a first step and future education will have better examples to learn from.
> traffic lights by design are very clearly red or green
I suspect you feel this because you are observing the output of a very sophisticated image-processing pipeline in your own head. When you are dealing with raw matrices of RGB values it all becomes a lot fuzzier, especially under different illuminations and exposures, and when the crop of the traffic light is noisy. Not saying it is some intractably hard machine-vision problem, because it is not. But there is some variety and fuzziness in the raw sensor measurements.
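To make the fuzziness concrete, here's a minimal sketch (made-up numbers, not from the article): average the pixels of a noisy crop and classify by which channel dominates, and the dim or off-color cases start landing near the decision boundary.

    import random

    def mean_rgb_of_noisy_crop(base_rgb, noise=60, n_pixels=64):
        # Simulate averaging a small, noisy crop of a traffic light.
        def channel_mean(c):
            samples = [min(255, max(0, c + random.gauss(0, noise))) for _ in range(n_pixels)]
            return sum(samples) / n_pixels
        return tuple(channel_mean(c) for c in base_rgb)

    def classify(rgb):
        # Naive rule: call it red if the red channel beats the green one.
        r, g, b = rgb
        return "red" if r > g else "green"

    # Bright lights classify cleanly; the dim/off-color one lands near the boundary.
    for base, label in [((200, 40, 30), "red"), ((40, 180, 60), "green"), ((120, 110, 40), "dim/ambiguous")]:
        sample = mean_rgb_of_noisy_crop(base)
        print(label, [round(c) for c in sample], "->", classify(sample))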
That the HDMI Forum does not allow TVs to be sold with DisplayPort is a massive reason I think they deserve to have their building surrounded by angry people with pitchforks and torches. Anti-competitive abusers, doing awful things to prevent a better world.
DisplayPort actually makes sense as a digital protocol, whereas HDMI inherits all the insane baggage of the analog past and just sucks. HDMI is so awful.
No, they don't put DP on because every $ of hardware they fit to the TV needs to provide value. DP requires a large board component that may need manual handling, circuit traces (+ decoupling) and silicon on the chip to interface. It then requires software support in the stack and that needs testing/validation.
The percentage of people who will actually use DP to connect their TV vs. HDMI is tiny. Even people who do have DisplayPort on their monitors will often connect with HDMI just because it's the more familiar connector. I spent a decade working in that area, and we were literally debating spending cents on devices that retailed for hundreds or thousands. The secondary problem is that ~90% of TVs sold use the same family of chips from MStar, so even if you wanted to go off-track and make something special, you can only do it with off-the-shelf silicon unless you pay a fortune for your own spin of the silicon. And if you want to do that, you'd better commit to buying >1m chips or they won't get out of bed.
The HDMI Forum was founded mostly by TV manufacturers; they're not interested in constraining the market in that way. It's all just been market consolidation and making TVs cheaper through tighter integration.
Oh wow, that explains a lot. I sort of always figured it was just market momentum that meant you never see TVs with a DisplayPort. Sort of like:
... we need a digital video link
the DDWG develops DVI
... market gap for TVs identified
the HDMI group develops HDMI, which is DVI with an audio channel
... while technically a minor feature, that audio link was the killer feature for digital TVs and led to HDMI being the popular choice for TVs
VESA develops DisplayPort, a packet-based digital link (vs. streaming for DVI and HDMI); its packet nature allows for several interesting features, including sending audio and driving multiple screens
... no TVs use it; while DisplayPort is better than HDMI, it is not better enough to make a difference to the end user, so HDMI remains the norm for TVs. You can find a few computer monitors with DP, but you have to seek them out.
I will have to see if there is some sort of stupid "additional licensing cost" if a TV is produced with DisplayPort; that would explain so much. I don't claim that there are no TVs with DP, but I certainly have never seen one.
The write-up on phys.org is troublesome at best. It leads with the Ming Hsieh Department of Electrical and Computer Engineering, then buries the rest of the affiliation in paragraph 5: USC (University of Southern California) and the Abbe Center of Photonics, Friedrich Schiller University Jena, Germany.
This team has made a nonlinear lattice that relies on something they call "Joule-Thomson-like expansion." The Joule-Thomson effect is the ideal gas law in beginning science: PV = nRT. Compression heats a gas, expansion cools a gas.
The reason they're studying the equivalent photonics principle [1] is that it focuses an array of inputs, "causing light to condense at a single spot, regardless of the initial excitation position." Usually the problem is that light propagation is linear: two beams blissfully ignore each other. To do useful switching or compute, one of the beams has to be able to act as a control signal.
A photon gas doesn't conserve the number of particles (n) as beginning physics would suggest. This lets the temperature of the gas control the output.
The temperature, driven by certain specific inputs, produces the nonlinear response. I didn't see a specific claim about what gain they achieved.
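The textbook way to state that (my gloss, not from the paper): photons have zero chemical potential, so their number adjusts to the temperature rather than being a fixed input:

    \mu_{\text{photon}} = 0 \quad\Rightarrow\quad n_{\gamma} \propto T^{3} \ \text{(blackbody photon number density)}

So in the PV = nRT analogy, n is itself a function of T, which is what lets the temperature act as a control knob.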
This paper is more on the theoretical end of photonics research. Compare practical research such as at UBC Vancouver [2], where a device achieves a "weight update speed of 60 GHz" and for clustering can handle "112 x 112-pixel images"; the tech doesn't compete well against electronics yet.
TSMC and NVidia are attempting photonics plays too. But they're only achieving raw I/O with photons. They can attach the fiber directly to the chip to save watts and boost speeds.
Basic physics gets in the way too. A photon's wavelength at near-UV is 400 nanometers, but the transistors in a smartphone are measured at around 7 nanometers. Electrical conduction is fundamentally smaller than a waveguide for light. Where light could maybe outshine electrons is in switching speed. But this research paper doesn't claim high switching speed.
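Back-of-the-envelope on that size gap (my numbers, assuming a dielectric waveguide with refractive index n ≈ 1.5): the diffraction limit puts a floor on waveguide dimensions of roughly

    d_{\min} \approx \frac{\lambda}{2n} \approx \frac{400\ \text{nm}}{2 \times 1.5} \approx 133\ \text{nm}

which is still well over an order of magnitude larger than advertised transistor feature sizes.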
Minor nit: Joule-Thomson is not just the ideal gas law; it is a separate thermodynamic effect entirely. Case in point: for certain gases the change in temperature due to Joule-Thomson has the opposite sign from what you would predict from the ideal gas law alone.
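Concretely (standard thermodynamics, not from this paper): the Joule-Thomson coefficient is

    \mu_{JT} = \left(\frac{\partial T}{\partial P}\right)_{H} = \frac{V}{C_p}\left(\alpha T - 1\right)

where \alpha is the thermal expansion coefficient. For an ideal gas \alpha = 1/T, so \mu_{JT} = 0: an ideal gas doesn't change temperature at all in a throttling expansion. Any heating or cooling in a real gas comes entirely from non-ideal behavior, with the sign set by the inversion temperature.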
This has interesting applications. For example, you can exploit this with dilute metal vapor in an expanding helium gas to cool the metal vapor to very low temperature - the Joule-Thomson expansion of helium increases the helium's temperature by converting the energy of the intermolecular forces into heat. This draws out energy from the metal vapor. If done in a vacuum chamber, then in the region before the shockwave formed by the helium, the supercooled metal atoms will form small van der Waals clusters that can be spectroscopically probed in the jet. This was an interesting area of study back in the 80s that advanced our understanding of van der Waals forces.
Light doesn't interact with itself directly; it needs a third, non-light partner. So yes, the light of course needs to interact with a lattice made of atoms to make any switching possible here. This is why we can see light from the stars even though it had to travel through other light for millions of years.
> TSMC and NVidia are attempting photonics plays too.
It's probably been six years since I looked at this space. The problem at the time for TSMC and several other players was that their solutions worked fairly well for firing photons vertically out of the chip and not well at all for firing them horizontally through the chip. I don't know whether, in the short and mid term, an optical PCIe or memory bus buys more overall horsepower than faster cross-chip communication in CPUs. But the solutions they were still chasing back then were good between chips, maybe between chiplets. Which could still be an interesting compromise.
> 400 nanometers, but the transistors in a smartphone are measured at around 7 nanometers
The best EM sensors need to be at least 1/10th the wavelength they are sending/receiving, right? 40 nm isn't awful, but it does suggest light for communication between functional units, rather than for assembling them.
It's true that the transistors are on the order of 50 nm, but the conduits for getting the electrons to those transistors are presumably a bit smaller than that.
Probably not as small as 7 nm, but not the full 50 nm either.
As a result, more used solar should become available on eBay. I'm excited to see what I can do on a shoestring budget.