• 2 Posts
  • 120 Comments
Joined 7 days ago
Cake day: September 14th, 2025

  • I mean, I’m listing it because I believe that it’s something of value that could be done with the information. But it’s an “are the benefits worth the costs” thing. Let’s say that you need to pay $800 and wear a specific set of glasses everywhere. Gotta maintain a charge on them. And while they’re maybe discreet compared to a smartphone, I assume that people in a role where they’re prominent (diplomacy, business deal-cutting, etc.) probably know what the glasses look like and do, so I imagine that any relationship-building that might come from showing that you can remember someone’s name and personal details (“how are Margaret and the kids?”) would likely be somewhat undermined if they know that you’re walking around with the equivalent of your Rolodex in front of your eyeballs. Plus, some people might not like others running around with recording gear (especially in some of the roles listed).

    I’m sure that there are a nonzero number of people who would wear them, but I’m hesitant to believe that as they exist today, they’d be a major success.

    I think that some of the people who are building some of these things grew up with Snow Crash and it was an influence on them. Google went out and made Google Earth; Snow Crash had a piece of software called Earth that did more-or-less the same thing (albeit with more layers and data sources than Google Earth does today). Snow Crash had the Metaverse with VR goggles and such; Zuckerberg very badly wanted to make it real, and made a VR world and VR hardware and called it the Metaverse. Snow Crash predicts people wearing augmented reality gear, but also talks about some of the social issues inherent with doing so; it didn’t expect everyone to start running around with them:

    Someone in this overpass, somewhere, is bouncing a laser beam off Hiro’s face. It’s annoying. Without being too obvious about it, he changes his course slightly, wanders over to a point downwind of a trash fire that’s burning in a steel drum. Now he’s standing in the middle of a plume of diluted smoke that he can smell but can’t quite see.

    It’s a gargoyle, standing in the dimness next to a shanty. Just in case he’s not already conspicuous enough, he’s wearing a suit. Hiro starts walking toward him. Gargoyles represent the embarrassing side of the Central Intelligence Corporation. Instead of using laptops, they wear their computers on their bodies, broken up into separate modules that hang on the waist, on the back, on the headset. They serve as human surveillance devices, recording everything that happens around them. Nothing looks stupider, these getups are the modern-day equivalent of the slide-rule scabbard or the calculator pouch on the belt, marking the user as belonging to a class that is at once above and far below human society. They are a boon to Hiro because they embody the worst stereotype of the CIC stringer. They draw all of the attention. The payoff for this self-imposed ostracism is that you can be in the Metaverse all the time, and gather intelligence all the time.

    The CIC brass can’t stand these guys because they upload staggering quantities of useless information to the database, on the off chance that some of it will eventually be useful. It’s like writing down the license number of every car you see on your way to work each morning, just in case one of them will be involved in a hit-and-run accident. Even the CIC database can only hold so much garbage. So, usually, these habitual gargoyles get kicked out of CIC before too long.

    This guy hasn’t been kicked out yet. And to judge from the quality of his equipment – which is very expensive – he’s been at it for a while. So he must be pretty good.

    If so, what’s he doing hanging around this place?

    “Hiro Protagonist,” the gargoyle says as Hiro finally tracks him down in the darkness beside a shanty. “CIC stringer for eleven months. Specializing in the Industry. Former hacker, security guard, pizza deliverer, concert promoter.” He sort of mumbles it, not wanting Hiro to waste his time reciting a bunch of known facts.

    The laser that kept jabbing Hiro in the eye was shot out of this guy’s computer, from a peripheral device that sits above his goggles in the middle of his forehead. A long-range retinal scanner. If you turn toward him with your eyes open, the laser shoots out, penetrates your iris, tenderest of sphincters, and scans your retina. The results are shot back to CIC, which has a database of several tens of millions of scanned retinas. Within a few seconds, if you’re in the database already, the owner finds out who you are. If you’re not already in the database, well, you are now.

    Of course, the user has to have access privileges. And once he gets your identity, he has to have more access privileges to find out personal information about you. This guy, apparently, has a lot of access privileges. A lot more than Hiro.

    “Name’s Lagos,” the gargoyle says.

    So this is the guy. Hiro considers asking him what the hell he’s doing here. He’d love to take him out for a drink, talk to him about how the Librarian was coded. But he’s pissed off. Lagos is being rude to him (gargoyles are rude by definition).

    “You here on the Raven thing? Or just that fuzz-grunge tip you’ve been working on for the last, uh, thirty-six days approximately?” Lagos says.

    Gargoyles are no fun to talk to. They never finish a sentence. They are adrift in a laser-drawn world, scanning retinas in all directions, doing background checks on everyone within a thousand yards, seeing everything in visual light, infrared, millimeter-wave radar, and ultrasound all at once. You think they’re talking to you, but they’re actually poring over the credit record of some stranger on the other side of the room, or identifying the make and model of airplanes flying overhead. For all he knows, Lagos is standing there measuring the length of Hiro’s cock through his trousers while they pretend to make conversation.

    I think that Stephenson probably did a reasonable job there of highlighting some of the likely social issues that come with having wearable computers with always-active sensors running.


  • tal@olio.cafe to 3DPrinting@lemmy.world: “Explain that”

    kagis

    It does sound like there are people who have been working on synthesizing spider silk for some time. So maybe we’ll get there in our lifetimes.

    https://old.reddit.com/r/askscience/comments/qiy6x/what_is_keeping_us_from_making_synthetic_spider/

    What is keeping us from making synthetic spider silk?

    Hey, I can tackle this one because I work in a lab where we ARE making synthetic spider silk.

    First off, the collection of natural silk or the farming of spiders is difficult on a large scale. This is due to spiders being cannibalistic and territorial. So what we’ve done is create transgenic organisms that create the spider silk proteins for us. These organisms include goats, silkworms, bacteria and alfalfa.

    Problems still exist overall. For example, for every organism, except silkworms, we must spin the protein fibers ourselves. This is the current bottleneck in the production line. After the long process of protein purification, the proteins are dissolved in an organic solvent, and pushed through a long thin needle into an alcohol coagulation bath. The fibers are then treated by different methods to try to increase the strength further. Currently, we can take 1 gallon of goat’s milk and purify between 1 and 10 grams of protein. From 1 gram of protein we can spin hundreds of meters of silk. The silk is not as strong as the native silk, but stronger than Kevlar and silkworm silk. We are currently working on optimizing this procedure, as well as up-scaling it.

    The other promising organism is the transgenic silkworms. The benefit of the silkworms is that they spin the fibers for us. The most recent data show that a fiber containing 5% spider silk proteins increases the strength of the silkworm silk by 50%. If we can increase the amount of protein in the silkworms, it may be the most promising way to produce large amounts of silk, due to the infrastructure for silk manufacturing already existing for silkworm cocoons.

    Currently, I am working on a couple of projects. One is mixing different ratios of silkworm silk and spider silk (created from bacteria), and finding the changes in mechanical strengths. It is unlikely we can go much higher than 20% spider silk proteins without completely knocking out the silkworm genes altogether (which may be a future project). Another project I am working on is trying to create a human ACL from transgenic silkworm silk/spider silk fibers. We will be cabling and braiding the fibers in different ways to find the best method of creating ligaments.

    So, in closing, we are making synthetic silks; however, only in the lab. Once the technology is optimized, it will be moved into industry and many different applications may come from it.
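    The yield figures quoted above invite some quick arithmetic. A minimal sketch; the 300 m/g number is my assumed midpoint for “hundreds of meters”, not a figure from the post:

```python
# Back-of-the-envelope silk yield from the figures quoted above:
# 1 gallon of transgenic goat milk -> 1 to 10 g of purified protein,
# and 1 g of protein -> "hundreds of meters" of spun fiber.
# METERS_PER_GRAM is an assumed midpoint, not a number from the source.

METERS_PER_GRAM = 300

def silk_meters_per_gallon(grams_protein: float) -> float:
    """Meters of spun silk from one gallon's worth of purified protein."""
    return grams_protein * METERS_PER_GRAM

low = silk_meters_per_gallon(1)    # low-yield gallon
high = silk_meters_per_gallon(10)  # high-yield gallon
print(f"{low:.0f} to {high:.0f} m of fiber per gallon of milk")
```

    So even a low-yield gallon of milk would correspond to hundreds of meters of fiber; the bottleneck the poster describes is the purification and spinning, not the raw length.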

    https://www.science.org/content/article/black-widows-spin-super-silk

    The silk of the humble spider has some pretty impressive properties. It’s one of the sturdiest materials found in nature, stronger than steel and tougher than Kevlar. It can be stretched several times its length before it breaks. For these reasons, replicating spider silk in the lab has been a bit of an obsession among materials scientists for decades.

    Now, researchers at the University of Cambridge have created a new material that mimics spider silk’s strength, stretchiness and energy-absorbing capacity. This material offers the possibility of improving on products from bike helmets to parachutes to bulletproof jackets to airplane wings. Perhaps its most impressive property? It’s 98 percent water.

    “Spiders are interesting models because they are able to produce these superb silk fibers at room temperature using water as a solvent,” says Darshil Shah, an engineer at Cambridge’s Centre for Natural Material Innovation. “This process spiders have evolved over hundreds of millions of years, but we have been unable to copy so far.”

    The lab-made fibers are created from a material called a hydrogel, which is 98 percent water and 2 percent silica and cellulose, the latter two held together by cucurbiturils, molecules that serve as “handcuffs.” The silica and cellulose fibers can be pulled from the hydrogel. After 30 seconds or so, the water evaporates, leaving behind only the strong, stretchy thread.

    The fibers are extremely strong – though not quite as strong as the strongest spider silks – and, significantly, they can be made at room temperature without chemical solvents. This means that if they can be produced at scale, they have an advantage over other synthetic fibers such as nylon, which require extremely high temperatures for spinning, making textile production one of the world’s dirtiest industries. The artificial spider silk is also completely biodegradable. And since it’s made from common, easily accessible materials – mainly water, silica and cellulose – it has the potential to be affordable.

    Shah and his team are far from the only scientists to work on creating artificial spider silk. Unlike silkworms, which can be farmed for their silk, spiders are cannibals who wouldn’t tolerate the close quarters necessary for farming, so turning to the lab is the only way to get significant quantities of the material. Every few years brings headlines about new inroads in the process. A German team has modified E. coli bacteria to produce spider silk molecules. Scientists at Utah State University bred genetically modified “spider goats” to produce silk proteins in their milk. The US army is testing “dragon silk” produced via modified silkworms for use in bulletproof vests. Earlier this year, researchers at the Karolinska Institute in Sweden published a paper on a new method for using bacteria to produce spider silk proteins in a potentially sustainable, scalable way. And this spring, California-based startup Bolt Threads debuted bioengineered spider silk neckties at the SXSW festival. Their product is made through a yeast fermentation process that produces silk proteins, which then go through an extrusion process to become fibers. It’s promising enough to have generated a partnership with outdoor manufacturer Patagonia.

    But, as a 2015 Wired story points out, “so far, every group that’s attempted to produce enough of the stuff to bring it to the mass market, from researchers to giant corporations, has pretty much failed.”


  • tal@olio.cafe to 3DPrinting@lemmy.world: “Explain that”

    considers

    I think that with thermoplastic, the problem is that you’re extruding a liquid that hardens as it cools. Unless you have very good information about the particular filament used, a very good model of how it acts as it cools, good control over airflow, and good control over (or at least sensors to get a very good awareness of) environmental temperature, you’re going to have a hard time extruding something at precisely the right rate such that it cools into exactly the shape you want. You’re also facing the constraint of keeping the thermoplastic in the extruder at the right fluidity. Maybe you could…have the filament be melted, then enter some kind of heated pump…that’d help decouple the rate at which you need to extrude from the temperature that you need to maintain in the extruder.

    In theory, it’s possible to move a 3D printer’s extruder and extrude at just the right rate such that you could run a line from point A to point B without regard for support. But in practice, I think that current thermoplastic printers would have a long way to go before they could reliably do that.
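    To make the cooling problem concrete, here’s a minimal sketch using Newton’s law of cooling. Every constant (nozzle temperature, glass-transition point, cooling coefficient) is an illustrative assumption, not measured filament data:

```python
# Newton's-law-of-cooling model of a just-extruded filament segment:
# dT/dt = -k * (T - ambient). All constants are illustrative assumptions,
# not measured properties of any real filament.

def time_to_solidify(ambient_c: float, nozzle_c: float = 200.0,
                     glass_transition_c: float = 60.0,
                     k: float = 0.8, dt: float = 0.01) -> float:
    """Seconds until the segment cools below its glass-transition temperature."""
    t, temp = 0.0, nozzle_c
    while temp > glass_transition_c:
        temp += -k * (temp - ambient_c) * dt  # Euler step of Newton cooling
        t += dt
    return t

# A warmer environment slows hardening, so the "right" extrusion rate for an
# unsupported span shifts with ambient temperature -- which is exactly the
# control problem described above.
print(f"cool room:      {time_to_solidify(25.0):.2f} s")
print(f"heated chamber: {time_to_solidify(50.0):.2f} s")
```

    Even in this toy model, time-to-solidify moves with ambient temperature, which is why printing an unsupported line in mid-air demands such tight sensing and control.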

    That being said…

    A printer that could print in spider silk — or a printer that could print in multiple materials, including spider silk — might have some neat applications.

    https://www.science.org/content/article/black-widows-spin-super-silk

    Need a strong elastic fiber? Try black widow silk. The thread spun by these deadly spiders is several times as strong as any other known spider silk–making it about as durable as Kevlar, a synthetic fiber used in bulletproof vests, according to a report presented here at the annual meeting of the Society for Integrative and Comparative Biology.

    I mean, I’d kind of imagine that you could maybe even use that in some sort of composite, to strengthen other printed things in various ways.

    Now I kind of want a black widow spider silk 3D printer.



  • It’s not clear to me whether or not the display is fundamentally different from past versions, but if not, it’s a relatively-low-resolution display on one eye (600x600). That’s not really something you’d use as a general monitor replacement.

    The problem is really that what they have to do is come up with software that makes the user want to glance at something frequently (or maybe unobtrusively) enough that they don’t want to have their phone out.

    A phone has a generally-more-capable input system, more battery, a display that is for most-purposes superior, and doesn’t require being on your face all the time you use it.

    I’m not saying that there aren’t applications. But to me, most applications look like smartwatch things, and smartwatches haven’t really taken the world by storm. Just not enough benefit to having a second computing device strapped onto you when you’re already carrying a phone.

    Say someone messages multiple people a lot, can’t afford to have sound playing, and needs to be moving around, so they can’t have their phone on a desk in front of them with the display visible to get a visual indicator of an incoming message and who it’s from. That could provide some utility, but I think that for the vast majority of people, it’s just not enough of a use case to warrant wearing the thing if you’ve already got a smartphone.

    My guess is that the reason you’d use something like this specific product (which has a camera on it and, compared to, say, XREAL’s options, limited display capabilities, so isn’t really geared up for AR applications where you’re overlaying data all over everything you see) is to pull up a small amount of information about whoever you’re looking at, like doing facial recognition to remember or obtain someone’s name and avoid a bit of social awkwardness. Maybe there are people for whom that’s worthwhile, but the market just seems pretty limited to me.

    I think that maybe there’s a world where we want to have more battery power and/or compute capability with us than an all-in-one smartphone will handle, and so we separate display and input devices and have some sort of wireless communication between them. This product has already been split into two components, a wristband and glasses. In theory, you could have a belt-mounted, purse-contained, or backpack-contained computer with a separate display and input device, which could provide for more-capable systems without needing to hold a heavy system up. I’m willing to believe that the “multi-component wearable computer” could be a thing. We’re already there to a limited degree with Bluetooth headsets/earpieces. But I don’t really think that we’re at that world more-broadly.

    For any product, I just have to ask — what’s the benefit it provides me with? What is the use case? Who wants to use it?

    If you get one, it’s $800. It provides you with a different input mechanism than a smartphone, which might be useful for certain applications, though I think it’s less generally useful. It provides you with a (low-resolution, monocular, unless this generation has changed) HUD that’s always visible, which a user may be able to check more discreetly than a smartphone. It has a camera always out. For it to make sense as a product, I think that there has to be some pretty clear, compelling application that leverages those characteristics.


  • I mean, Trump’s a pretty bad president, but under the system, as it stands, if an unjust prosecution happens, the courts are expected to shoot it down. That’s why one has a court system. It shouldn’t fall over just because he demands prosecution of political opponents.

    In Japan, you have a system where prosecuted cases virtually always lead to a conviction, where for practical purposes, the “filter” happens at the decision to prosecute:

    https://www.nippon.com/en/japan-topics/c05401/order-in-the-court-explaining-japan’s-99-9-conviction-rate.html

    MURAOKA: The conviction rate in most countries, including those with plea bargain systems, is generally over 90 percent. Many trials do end in acquittals, though. By comparison, Japan’s 99.9 percent conviction rate is unnaturally high.

    Prosecutors in any country generally pursue cases where they are confident of a positive outcome. However, they are still required to prove the defendant’s guilt beyond a reasonable doubt. Japan’s conviction rate creeping toward 100 percent has raised red flags among legal scholars overseas who question whether judges are actually ruling according to the law or are merely deferring to the prosecution.

    But that’s not how the US works.

    There’s a legitimate issue in that a prosecution can cause a defendant to incur legal fees — and maybe we should try to mitigate that more than we do today. Or maybe the nuisance factor. Trump certainly has managed to fire people in the Executive Branch who he was angry at. But I’m not especially worried that Trump is going to be just running around convicting people of crimes because he doesn’t like them. Trump was prosecuted and convicted because he broke the law. He is, no doubt, pissed off about that. But it doesn’t mean that he can just readily go out and have people convicted who he personally doesn’t like who haven’t broken the law.

    I’d also add that beyond judges acting to throw out cases that flagrantly don’t have any merit, or to rule in favor of a defendant, even if you could somehow compromise a judge, the common-law system has the right to a jury trial to add yet another barrier to a compromised government attempting to misuse prosecution.

    Finally, there’s the pardon, which can come from a future administration; it’s something that Trump himself has used very vigorously to remove punishment from people who he liked.

    This is something that the system is already designed to handle. It doesn’t need out-of-band involvement.



  • I mean, I think that they had some real data privacy issues with the “screenshot everything by default” stuff.

    However, my bet is that at some point in time, it will be the case that we do wind up in a situation where you do have some kind of system that is processing your data and doing some kind of analysis so that you can make use of it. It may not leave your system (though OS vendors providing storage and “cloud backup” services are a thing with Apple and Google and Microsoft today). But think of, for example, search. On most systems today, there is something that is trawling through your data, building an index, processing it, and putting it in some form via which you can access it. On mine, I don’t run a full-text indexing thing, but those certainly do exist.

    That doesn’t mean that these particular laptops will be what does it, or that Microsoft’s Copilot software — in its present form, at any rate, though if they use it as their “brand” for all their machine learning stuff, it could do all sorts of stuff — will be what uses it, or even that laptops will start doing a lot more local processing in general. But I think that the basic thing that they are shooting for, which is, in some way, the PC extracting information from your data and then using it to provide more-useful behavior, probably will be a thing.

    On emacs, by default, M-/ runs dabbrev-expand. That looks through all buffers and, depending upon how you have things set up, possibly some other data sources, and using that as a set of potential completions, tries to complete whatever word is at point in the current buffer. Think of it as sort of a “global tab complete”. That’s a useful feature to have, and it’s a — much simpler — example of something where the system can look at your data and generate something useful from it by doing some processing. That’s been around for a long time.
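    A toy version of the idea, for the curious. This mimics the gist of scanning buffers for prefix completions, not Emacs’s actual nearest-first ordering rules:

```python
# Sketch of dabbrev-style completion: collect words from a set of "buffers"
# and offer the ones matching the prefix, preferring the current buffer.
# This mimics the gist, not Emacs's actual search order.

import re

def dabbrev_candidates(prefix, current_buffer, other_buffers):
    """Return distinct completions for `prefix`, current-buffer words first."""
    seen, out = set(), []
    for text in [current_buffer, *other_buffers]:
        for word in re.findall(r"[A-Za-z_]\w*", text):
            if word.startswith(prefix) and word != prefix and word not in seen:
                seen.add(word)
                out.append(word)
    return out

buffers = ["open the transmitter", "transmission log for the telegraph office"]
print(dabbrev_candidates("trans", buffers[0], buffers[1:]))
# -> ['transmitter', 'transmission']
```

    The point is how little machinery a “global tab complete” needs: the system already holds your data, so mining it for completions is nearly free.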

    A number of systems will look at one’s contacts list to provide completions or suggestions.

    It’s pretty much the norm on mobile systems to have some kind of autocorrection for text input — though I presently have it off on my Android phone, because the soft keyboard I use, Anysoft, presently has some bug that causes it to insert bogus text in some applications. That autocorrection usually has some kind of “user dictionary”, and at least on some soft keyboards I’ve seen, automatically learning entries for that user dictionary from past text that you’ve entered is a thing.
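    As a sketch of what that learning loop looks like (hypothetical, and far simpler than a real soft keyboard, which weighs many more signals): count the words the user actually types, then rank correction candidates by edit distance with learned frequency as a tiebreaker:

```python
# Toy "user dictionary" autocorrect: learn word frequencies from past typed
# text, then pick the closest known word (ties broken by frequency). Real
# soft keyboards use far more signals; this shows only the basic idea.

from collections import Counter

def levenshtein(a: str, b: str) -> int:
    """Classic edit distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def learn(history: str) -> Counter:
    """Build the user dictionary from past typed text."""
    return Counter(history.lower().split())

def correct(typo: str, user_dict: Counter, max_dist: int = 2) -> str:
    """Best correction: smallest edit distance, then highest learned count."""
    scored = [(levenshtein(typo, word), -count, word)
              for word, count in user_dict.items()
              if levenshtein(typo, word) <= max_dist]
    return min(scored)[2] if scored else typo

history = "peertube instance peertube server fediverse instance instance"
print(correct("instnace", learn(history)))  # -> instance
```

    The privacy tension is visible even here: the dictionary is literally a record of what you’ve typed, which is why who gets to read it matters.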

    Speech recognition hasn’t quite become the norm for interfacing with a computer the way a lot of sci-fi in the past portrayed it (though there are some real hands-free applications for driving), but on systems that do use it, it’s pretty much the norm to learn from past speech and use it to improve current recognition.

    Those are all examples of the system shuffling through your data and using it to improve performance elsewhere. They certainly can have data privacy issues, like malicious soft keyboards on mobile OSes having access to everything one types. But they have established themselves. So I kind of suspect that Microsoft’s basic idea of ramming a lot of your data — maybe not everything you see on the screen — into a machine learning system and then using that training to do something useful for you is going to probably be a thing at some point in the future.

    EDIT: There have been situations in the past where a new technology came out and companies tried really hard to find a business application for it, and it never really went anywhere. One example is 3D hardware rendering. When 3D hardware came out, outside of CAD and architecture, it was mostly used for video games. There were some companies who tried figuring out ways to get businesses to spend on it because it could do something useful for them. I remember Intel showing rotating 3D charts and stuff. Today, we still mostly use 3D hardware rendering for playing video games, and there isn’t much of a business case for it.

    My guess is that that probably won’t be the fate of general machine learning running on our data. That is, there may be more data privacy safeguards put into place, or data might not leave our systems, or learning might happen from sources other than screenshots, or might not actually run on a laptop, or any number of things. But I suspect that someone is going to manage to produce some kind of system that leverages parallel processing to run a machine learning system that performs tasks that businesses find valuable enough to spend on.


  • I’m pretty sure that that guide is one of those AI-generated spam sites. In this case, it appears to use a character where the LLM involved wasn’t too sure about whether the character is a house painter or an artistic painter. Which doesn’t mean that the information on it is necessarily wrong, just that I’d be cautious as to errors. If you want information from an LLM, probably better in terms of response quality to just, well, go ask an LLM yourself without the distortion from a spammer trying to have the LLM role-play some character.







  • I believe that the fediverse.observer site can list any Fediverse instance type by number of users (though not active users).

    checks

    Oh, they do do active users.

    https://peertube.fediverse.observer/list

    Looks like the top one is phijkchu.com, at 8074 active users.

    EDIT: There’s also fedidb.com:

    https://fedidb.com/servers

    Choose “PeerTube” as server type, and they’ll give you some data on instances too.

    EDIT2: Note that another way to explore PeerTube, which may be to your taste, is that Google Video indexes PeerTube servers. I don’t know of a way to restrict a search to only PeerTube servers, aside from using something like site:phijkchu.com to restrict it on an instance-by-instance basis, but if you search for something that’s on PeerTube and Google has indexed it, it should come up there.
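    The per-instance site: trick can at least be scripted. A sketch; the host list here is illustrative, and you’d substitute instances pulled from fediverse.observer or fedidb.com:

```python
# Build a Google Video search URL that ORs together site: filters for known
# PeerTube hosts. The instance list is illustrative; substitute hosts from
# fediverse.observer or fedidb.com. tbm=vid selects Google's video vertical.

from urllib.parse import quote_plus

def peertube_video_search(terms: str, hosts: list[str]) -> str:
    """URL for a Google Video search restricted to the given hosts."""
    sites = " OR ".join(f"site:{h}" for h in hosts)
    return ("https://www.google.com/search?tbm=vid&q="
            + quote_plus(f"{terms} ({sites})"))

print(peertube_video_search("3d printing", ["phijkchu.com", "tilvids.com"]))
```

    Search engines cap the length of a query, so this only scales to a modest list of instances, but it approximates a “PeerTube-only” search without any special engine support.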

    Kagi also indexes videos, and lets one restrict the search by source of videos, with “PeerTube” being one.

    EDIT3: Adding “peertube” as a search term on Google Video isn’t ideal, but it did result in videos on PeerTube hosts at the top, so maybe that could be kind of an ad-hoc way of searching on Google Video.

    EDIT4: libera.site doesn’t appear to provide sortability, but it does list a video count per instance, as well as a bunch of other graphed data. Never seen it before now, though.

    https://libera.site/channel/peertube



  • The charge read: “On February 5 2023 you possessed an extreme pornographic image, which portrayed, in an explicit and realistic way, a person performing an act of intercourse with an animal, namely a fish, which was grossly offensive, disgusting or otherwise of an obscene character, and a reasonable person looking at the image would think that such a person or animal was real.”

    ^ Disgusting garbage of no cultural merit

    https://en.wikipedia.org/wiki/The_Dream_of_the_Fisherman's_Wife

    The work influenced later artists such as Félicien Rops, Auguste Rodin, Louis Aucoc, Fernand Khnopff and Pablo Picasso.[15] Picasso drew his own private version in 1903, which was displayed in a 2009 Museu Picasso exhibit titled Secret Images, alongside 26 other drawings and engravings by Picasso, displayed next to Hokusai’s original and 16 other Japanese prints, portraying the influence of 19th century Japanese art on Picasso’s work.[16] Picasso also later fully painted works that were directly influenced by the woodblock print, such as 1932’s Reclining Nude, where the woman in pleasure is also the octopus, capable of pleasuring herself.[17][18]

    ^ Influential classic work


  • Kids and their chats today have it easy, man.

    https://home.nps.gov/people/hettie-ogle.htm

    Hettie moved to Johnstown in 1869 to manage the Western Union telegraph office where she was employed on the day of the flood. Her residence was 110 Washington Street, next to the Cambria County Library. This also served as the Western Union office. Unlike many other telegraph operators associated with messaging on the day of the flood, Hettie was not employed by the Pennsylvania Railroad. She was a commercial operator. Three women were employed by Hettie: Grace Garman, Mary Jane Waktins and her daughter Minnie. They all died in the flood, including Hettie.

    A timeline of Hettie’s activity on May 31, 1889:
    7:44 a.m. - She sent a river reading. The water level was 14 feet.
    10:44 a.m. - The river level was 20 feet.
    11:00 a.m. - She wired the following message to Pittsburgh: “Rain gauge carried away.”
    12:30 p.m. - She wired “Water higher than ever known. Can’t give exact measurement” to Pittsburgh.
    1:00 p.m. - Hettie moved to the second floor of her home due to the rising water.
    3:00 p.m. - Hettie alerted Pittsburgh about the dam after receiving a warning from South Fork that the dam “may possibly go.” She wired “this is my last message.” The water was grounding her wires. A piece of sheet music titled “My Last Message” was published after the flood.

    Hettie’s house on Washington Street was struck by the flood wave shortly after 4:00 p.m.

    https://en.wikipedia.org/wiki/Halifax_Explosion

    The death toll could have been worse had it not been for the self-sacrifice of an Intercolonial Railway dispatcher, Patrick Vincent (Vince) Coleman, operating at the railyard about 230 metres (750 ft) from Pier 6, where the explosion occurred. He and his co-worker, William Lovett, learned of the dangerous cargo aboard the burning Mont-Blanc from a sailor and began to flee. Coleman remembered that an incoming passenger train from Saint John, New Brunswick, was due to arrive at the railyard within minutes. He returned to his post alone and continued to send out urgent telegraph messages to stop the train. Several variations of the message have been reported, among them this from the Maritime Museum of the Atlantic: “Hold up the train. Ammunition ship afire in harbor making for Pier 6 and will explode. Guess this will be my last message. Good-bye boys.” Coleman’s message was responsible for bringing all incoming trains around Halifax to a halt. It was heard by other stations all along the Intercolonial Railway, helping railway officials to respond immediately.[71][72] Passenger Train No. 10, the overnight train from Saint John, is believed to have heeded the warning and stopped a safe distance from the blast at Rockingham, saving the lives of about 300 railway passengers. Coleman was killed at his post.[71]