~$125B per year would be 2-3% of all domestic investment. It's similar in scale to the GDP of a small middle income country.

If the electric grid — particularly the interconnection queue — is already the bottleneck to data center deployment, is something on this scale even close to possible? If it's a rationalized policy framework (big if!), I would guess there's some major permitting reform announcement coming soon.


They say this will include hundreds of thousands of jobs. I have little doubt that dedicated power generation and storage are included in their plans.

Also, I have no doubt that the timing is deliberate and that this is not happening without government endorsement. If I had to guess, the US military is also involved and sees this initiative as important for national security.


Is there really any government involvement here? I only see Softbank, Oracle, and OpenAI pledging to invest $500B (over some timescale), but no real support on the government end outside of moral support. This isn't some infrastructure investment package like the IRA, it's just a unilateral promise by a few companies to invest in data centers (which I'm sure they are doing anyway).


> but no real support on the government end outside of moral support

The venture was announced at the White House, by the President, who has committed to help it by using executive orders to speed things up.

It might not have been voted on by Congress or whatever, but those things alone make it pretty clear the government provides more than just "moral support".


It's just Trump positioning himself for the eventual corrupt kickback.


I thought all the big corps had projects for the military already, if not DARPA directly, which is the org responsible for lots of university research (the counterpart to the NSF, which is the nice one that isn't funded by the military)?


Funding for DARPA and NSF ultimately comes from the same place. DARPA funds military research. NSF funds dual use[1] research. All of it is organized around long term research goals. I maintained some of the software involved in research funding decision making.

1: https://en.wikipedia.org/wiki/Dual-use_technology


It’s light on details, but from The Guardian’s reporting:

> The president indicated he would use emergency declarations to expedite the project’s development, particularly regarding energy infrastructure.

> “We have to get this stuff built,” Trump said. “They have to produce a lot of electricity and we’ll make it possible for them to get that production done very easily at their own plants.”

https://www.theguardian.com/us-news/2025/jan/21/trump-ai-joi...


hundreds of thousands of jobs? I'll wait for the postmortem on that prediction. Sounds a lot like Foxconn in Wisconsin but with more players.


On the one hand, the number is a political thumb-suck that sounds good. It's not based in any kind of actual reality.

Yes, the data center itself will create some permanent jobs (I have no real feel for this, but guessing less than 1000).

There'll be some work for construction folk of course. But again seems like a small number.

I presume though they're counting jobs related to the existence of a data center. As in, if I make use of it do I count that as a "job"?

What if we create a new post to leverage AI generally? Kinda like the way we have a marketing post, and a chunk of the daily work there is Adwords.

Once we start guesstimating the jobs created by the existence of an AI data center, we're in full speculation mode. Any number really can be justified.

Of course ultimately the number is meaningless. It won't create that many "local jobs" - indeed most of those jobs, to the degree they exist at all, will likely be outside the US.

So you don't need to wait for a post-mortem. The number is sucked out of thin air with no basis in reality for the point of making a good political sound bite.


> I presume though they're counting jobs related to the existence of a data center. As in, if I make use of it do I count that as a "job"?

Seeing how Elon deceives advertisers with false impressions, I could see him giving the same strategy a strong vote of confidence (with the bullshit metrics to back it!)


> hundreds of thousands of jobs?

I'm sure this will easily be true if you count AI as entities capable of doing jobs. Actually, they don't really touch that (if AI develops too quickly, there will be a lot of unemployment to contend with!) but I get the national security aspect (China is full speed ahead on AI, and by some measurements, they are winning ATM).


only $5M/job


They plan to have 100,000s of people employed to run on treadmills to generate the power.


Well I currently pay to do this work for free. More than happy to __get__ paid doing it.

Edit: Hey we can solve the obesity crisis AND preserve jobs during the singularity!! Win win!


Wow. What an idea you guys have there. Look - maybe you could sit the homeless and the mentally disabled on such power-generating bicycles, hmmm... what about convicts! Let them contribute to society, no free lunch! What an innovation!


Plus it's ecological, which for Trump is not intentional but still a win.

There is this pesky detail about manufacturing 100k treadmills, but let's not get bothered by details now; the current must flow.


"solve the obesity crisis" ? what exactly do you mean by this?


Probably referring to how many Americans are obese to an unhealthy degree as part of the joke.


Damn, 6 hours too slow to make this comment


A hamster wheel would work better?


Yes, Trump announced this as a massive foreign investment coming into the US: https://x.com/WatcherGuru/status/1881832899852542082


Just as there is an AWS for the public and something similar but only for federal use, it could be possible that there are AI cloud services available to the public and then a separate cloud service for federal use. I am sure that military, intelligence agencies, etc. would like to buy such a service.


AWS GovCloud already exists FYI (as you hinted) and it is absolutely used by the DoD extensively already.


Gas turbines can be spun up really quickly through either portable systems (like xAI did for their cluster) [1] or actual builds [2] in an emergency. The biggest limitation is permits.

With a state like Texas and a federal government that's on board, these permits would be a much smaller issue. The press conference makes this seem more like "drill baby drill" (drilling natural gas), with direct talk of them spinning up their own power plants.

[1] https://www.kunr.org/npr-news/2024-09-11/how-memphis-became-...

[2] https://www.gevernova.com/gas-power/resources/case-studies/t...


> It's similar in scale to the GDP of a small middle income country

I’ve been advocating for a data centre analogue to the Heavy Press Programme for some years [1].

This isn’t quite it. But when I mapped out costs, $1tn over 10 years was very doable. (A lot of it would go to power generation and data transmission infrastructure.)

[1] https://en.m.wikipedia.org/wiki/Heavy_Press_Program


One-time capital costs that unlock a range of possibilities also tend to be good bets.

The Flood Control Act [0], TVA, Heavy Press, etc.

They all created generally useful infrastructure, that would be used for a variety of purposes over the subsequent decades.

The federal government creating data center capacity, at scale, with electrical, water, and network hookups, feels very similar. Or semiconductor manufacture. Or recapitalizing US shipyards.

It might be AI today, something else tomorrow. But there will always be a something else.

Honestly, the biggest missed opportunity was supporting the Blount Island nuclear reactor mass production facility [1]. That was a perfect opportunity for government investment to smooth out market demand spikes. Mass deployed US nuclear in 1980 would have been a game changer.

[0] https://en.m.wikipedia.org/wiki/Flood_Control_Act_of_1928

[1] https://en.m.wikipedia.org/wiki/Offshore_Power_Systems#Const...


> Honestly, the biggest missed opportunity was supporting the Blount Island nuclear reactor mass production facility

Yes, a very interesting project; similar power output to an AP1000. Would have really changed the energy landscape to have such a deployable power station. https://econtent.unm.edu/digital/collection/nuceng/id/98/rec...


It is not just the queue that is the bottleneck. If the new power plants designed specifically for powering these new AI data centers are connected to the existing electric grid, energy prices for regular customers will also be affected, most likely upward. That means the cost of the transmission upgrades required by these new datacenters will be socialized, which is a big problem. There does not seem to be a solution in sight for this challenge.


Maybe they will invest in nuclear reactors.

Data centers, AI, and nuclear power stations. Three advanced technologies; that's pretty good.


They are trying. Microsoft wants to restart the Three Mile Island reactor, and other companies have been signing contracts for small modular reactors. SMRs are a perfect fit for modern data centers IF they can be made cheaply enough.


Wind, solar, and gas are all significantly cheaper in Texas, and can be brought online much quicker. Of course it wouldn't hurt to also build in some redundancy with nuclear, but I'll believe it when I see it; so far there's been lots of talk and little success in new reactors outside of China.


I think this is right- data centers powered by fission reactors. Something like Oklo (https://oklo.com) makes sense.


Notably, it is significantly more than the revenue of either AWS or Azure. It is very comparable to the sum of both, but consolidated into the continental US instead of distributed globally.


Watching the press conference, on-site power production was mentioned. I assume this means SMRs and solar.


Just as likely to be natural gas or a combination of gas and solar. I don't know what the supply chain looks like for solar panels, but I know gas can be done quickly [1], which is how this money has to be spent if they want to reach their target of $125 billion a year. From [0]:

> The companies said they will develop land controlled by Wise Asset to provide on-site natural gas power plant solutions that can be quickly deployed to meet demand in ERCOT.

> The two firms are currently working to develop more than 3,000 acres in the Dallas-Fort Worth region of Texas, with availability as soon as 2027.

[0] https://www.datacenterdynamics.com/en/news/rpower-and-wise-a...

[1.a] https://enchantedrock.com/data-centers/

[1.b] https://www.powermag.com/vistra-in-talks-to-expand-power-for...


US domestic PV module manufacturing capacity is ~40GW/year.


According to [1], the USA in January 2025 has almost 50GW/yr module manufacturing capacity. But to make modules you need polysilicon (25GW/yr manufacturing capacity in the US), ingots (0GW/yr), wafers (0GW/yr), and cells (0GW/yr). Hence the USA is seemingly entirely dependent on imports, probably from China which has 95%+ of the global wafer manufacturing capacity.

Even when accounting for announced capacity expansion, the USA is currently on target to remain a very small player in the global market with announced capacity of 33GW/yr polysilicon, 13GW/yr ingots, 24GW/yr wafers, 49GW/yr cells and 83GW/yr modules (13GW/yr sovereign supply chain limitation).

In 2024, China completed sovereign manufacturing of ~540GW of modules[2], including all precursor polysilicon, ingots, wafers and cells. China also produced and exported polysilicon, ingots, wafers and cells that were surplus to domestic demand. Many factories in China's production chain are operating at half their maximum production capacity due to global demand being less than half of global manufacturing capacity.[3]

[1] https://seia.org/research-resources/solar-storage-supply-cha...

[2] Estimated figure extrapolated from Jan-Oct 2024 data (10 months). https://taiyangnews.info/markets/china-solar-pv-output-10m-2...

[3] https://dialogue.earth/en/business/chinese-solar-manufacture...


Appreciate the correction and additional context, I appear to be behind wrt current state.


could something of this magnitude be powered by renewables only?


> could something of this magnitude be powered by renewables only?

Perhaps.

For context see https://masdar.ae/en/news/newsroom/uae-president-witnesses-l... which is a bit further south than the bulk of Texas and has not yet been built; 5.2GW of panels, 19GWh of storage. I have seen suggestions on Linkedin that it will be insufficient to cover a portion of days over the winter, meaning backup power is required.


Technically yes, but DC operators want fast ROI and the answer is no.


what prevents operators from getting ROI with renewables?


Datacenters can still achieve ROI, but in some cases, it may take longer than expected. This delay is primarily due to the increased complexity of managing operations with the variability introduced by intermittent energy sources. While batteries help mitigate this issue, their current costs make them less competitive compared to non-intermittent energy setups.


The I is high and R low.


I don't think any assembly line exists that can manufacture and deploy SMRs en masse on that kind of timeframe, even with a cooperative NRC


There have been literally 0 production SMR deployments to date so there’s no possibility they’re basing any of their plans on the availability of them.


Hasn't the US decided to prefer nuclear and fossil fuels (most expensive generation methods) over renewables (least expensive generation methods)?[1][2]

I doubt the US choice of energy generation is ideological as much as practical. China absolutely dominates renewables, with 80% of solar PV modules and 95% of wafers manufactured in China.[3] China installed a world-record 277GW of new solar PV generation in 2024, a 45% year-on-year increase.[4] By contrast, the US only installed ~1/10th of this capacity in 2024, with only 14GW of solar PV generation installed in the first half of 2024.[5]

[1] https://en.wikipedia.org/wiki/Cost_of_electricity_by_source

[2] https://www.iea.org/data-and-statistics/charts/lcoe-and-valu...

[3] https://www.iea.org/reports/advancing-clean-technology-manuf...

[4] https://www.pv-magazine.com/2025/01/21/china-hits-277-17-gw-...

[5] https://www.energy.gov/eere/solar/quarterly-solar-industry-u...


> Hasn't the US decided to prefer nuclear and fossil fuels (most expensive generation methods) over renewables (least expensive generation methods)?[1][2]

This completely ignores storage and the ability to control the output depending on needs. Instead of LCOE the LFSCOE number makes much more sense in practical terms.


Much more likely is what xAI did, portable gas turbines until the grid catches up.


One possibility would be just to build their own power plants colocated with the datacenters and not interconnect at all.


I like how you think this is possible.


Lol, how is it not possible?


It is, but at what cost?


https://www.global.toshiba/ww/products-solutions/nuclearener...

Toshiba 4S reactors in the 50 MW version can cost about $3,000,000,000 each.

Two of those produce 100 MW.

They don't require refueling for around 30 years. $6,000,000,000 to power a 100 MW datacenter, when we're talking about $500,000,000,000, is not too dramatic, especially considering the amortized yearly cost.


Maybe they could build nuclear but they could just as easily build solar with battery backup or natural gas. Lots of industrial facilities have natural gas cogeneration with the grid as a backup. If you can't get a grid connection at all due to permitting, you could just forego the backup. The reliability wouldn't be as good, but in a distributed datacenter there are ways of building in fault tolerance.

If these guys really have $500 billion, they're going to find a way to get electricity.


That's not how you calculate these things. Check [1] for an overview, specifically page 9. Note these metrics do not include the costs of handling nuclear waste, which are passed on to the future (at least 1000y).

[1] https://www.lazard.com/media/gjyffoqd/lazards-lcoeplus-june-...


DCs will start generating power on site soon. I know micro-nuclear is one area actively being explored.


Small or modular reactors in the US are more than 10 years away, probably more like 15-20. These are facts, not made-up politics or the pipe dreams of techno-snobs.


> Small or modular reactors in the US are more than 10 years away, probably more like 15-20

Could be 5 to 10 with $20+ bn/year in scale and research spend.

Trump is screwing over his China hawks. The anti-China and pro-nuclear lobbies have significant overlap; this could be how Trump keeps e.g. Peter Thiel from going thermonuclear on him.


I work in the sector, and it's impossible to build a full-sized reactor in less than 10 years; the usual over-run is 5 years. That's the time for tried and tested designs. The tech isn't there yet, and there are no working analogs in the US to use as an approved guide. The Department of Energy does not allow "off-the-cuff" designs for reactors. I think there are only two SMRs that have been built, one by the Russians and the other by China. I'm not sure they are fully functioning, or at least working as expected. I know there are going to be more small gas gens built in the near future, and that SMRs in the US are way off.


Guessing SMRs are a ways off, any thoughts on the container-sized microreactors that would stand in for large diesel gens? My impression is that they’re still in the design phase, and the supply chain for the 20% U-235 HALEU fuel is in its infancy, but this is just based on some cursory research. I like the prospect of mass manufacturing and servicing those in a centralized location versus the challenges of building, staffing, and maintaining a series of one-off megaprojects, though.


> it's impossible to build a full-sized reactor in less than 10 years, and the usual over-run is 5 years

I'm curious why that is. If we know how to build it, it shouldn't take that long. It's not like we need to move a massive amount of earth or pour a humongous amount of concrete or anything like that, which would actually take time. Then why does it take 15 years to build a reactor with a design that is already tried and tested and approved?


Well, you do have to move a lot of earth and pour A LOT of concrete :) Many steps have to be x-rayed, and many other tests done, before other steps can be started. Every weld is checked, and all internal and external concrete is cured, treated, and verified. If anything is wrong, it has to be fixed in place (if possible) or removed and redone. It's a slow process, and should be, for many steps.

One of the big issues (in the US especially) is that for 20+ years no new plants were built. This caused a large void in the talent pool, inside and outside the industry. That fact, along with others, has caused many problems with some recent US projects.


> I'm curious why that is.

When you're the biggest fossil fuel producer in the world, it's vital that you stay laser-focused on regulating nuclear power to death in every imaginable detail while you ignore the vast problems with unchecked carbon emissions and gaslight anyone who points them out.


https://www.forbes.com/global/2008/1124/103.html

112 reactors.

A gigawatt each.

Over 10 years ago.


Not all were built, and not all are in use or fully finished. Toshiba flubbed up majorly a few years later, and many projects were abandoned.


That's a bummer. We need nuclear, badly.


> it's impossible to build a full-sized reactor in less than 10 years

We’re not doing tried and tested.

> Department of Energy does not allow "off-the-cuff" designs for reactor

Not by statute!


I don't, and I honestly don't know much about it, but

> there are no working analogs in the US to use as an approved guide

small reactors have been installed on ships and submarines for over 70(!) years now. Reading up on the very first one, the USS Nautilus: "the conceptual design of the first nuclear submarine began in March 1950," so it took a couple of years? So why is it so unthinkably hard 70 years later, honest question? "The military doesn't care about cost" is not good enough; there are currently >100 active ones, with who knows how many hundreds in the past, so they must have cracked the cost formula at some point. Besides, by now we have hugely better tech than the '50s, so what gives?


Yeah, I wondered about seacraft reactors myself. I think there are many safety allowances for DOD vs. DOE. The DOD reactors are not publicly accessible (you hope anyway), and the data centers will be in and near the public. There are also major security measures that have to be taken for reactor sites. You have armed personnel before you even get to the reactors, and then the entrances are sometimes close to one mile away from the reactor. Once there, the number of guards and bang-bags goes up. The modern sites kind of look like they have small henges around them (back to the neolithic!) :)


That's why the tech oligarchs told Trump that Canada is required. Cheap hydroelectric power…


Don't worry, they said they are doing it in Texas where the power grid is super reliable and able to handle the massive additional load.


"Don't be snarky."

"Eschew flamebait."

Let's not have regional flamewar on HN please.

https://news.ycombinator.com/newsguidelines.html


Not guilty. No sarcasm intended, of course. If your guidelines are so broad as to include this, you should work on them, and in turn, yourself.

Governor says our power grid is the best in the universe. Why don't you believe us?

Stop breaking your own rules.

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."

"Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something."

Let's not ruin HN with overmoderation. This kind of thing is no longer in fashion, right?


If you didn't intend your comment to be a snarky one-liner, that didn't come across to me, and I'm pretty sure that would also be the case for many others.

Intent is a funny thing—people usually assume that good intent is sufficient because it's obvious to themselves, but the rest of us don't have access to that state, so it has to be encoded somehow in your actual comment in order to get communicated. I sometimes put it this way: the burden is on the commenter to disambiguate. https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

I take your point at least halfway though, because it wasn't the worst violation of the guidelines. (Usually I say "this is not a borderline case" but this time it was!) I'm sensitive to regional flamewar because it's tedious and, unlike national flamewar or religious flamewar, it tends to sneak up on people (i.e. we don't realize we're doing it).


So you are sorry and take it back? Should probably delete your comments rather than striking them out, as the guidelines say.

I live, work, and posted this from Texas, BTW...

Also it takes up more than one line on my screen. So, not a "one-liner" either. If you think it is, please follow the rules consistently and enforce them by deleting all comments on the site containing one sentence or even paragraph. My comment was a pretty long sentence (136 chars) and wouldn't come close to fitting in the 50 characters of a Git "one-liner".

Otherwise, people will just assume all the comments are filtered through your unpredictable and unfairly biased eye. And like I said (and you didn't answer), this kind of thing is no longer in fashion, right?

None of this is "borderline". I did nothing wrong and you publicly shamed me. Think before you start flamewars on HN. Bad mod.


Probably because they don’t have to deal with energy-related regulations…


That was sarcasm, the Texas grid falls over pretty much annually at this point.


Say what you will about Texas, but they are adding energy capacity, renewables especially, at a much faster rate than any comparable state.


How much capacity do solar and wind add compared to nuclear, per square foot of land used? Also, I thought the new administration was placing a ban on new renewable installations.


The ban is on offshore wind and for government loans for renewables. Won't really affect Texas much, it's Massachusetts that'll have to deal with more expensive energy.


Does anyone know how the ban on onshore wind will work? Is it on federal lands only? If so, how big of a deal is that?

I read this but it lacks information: https://apnews.com/article/wind-energy-offshore-turbines-tru...


Isn't there enough space in Texas? There are only 114 people per square mile. https://en.m.wikipedia.org/wiki/Texas


Why does it matter? Is land at a premium in Texas?


It doesn’t.


Why is that a useful metric? There is a lot of land.


Because the commenter is pro-nuclear and thinks nuclear will solve all of the short-term demand problems.


Ok but their grid sure seems to fail a lot.


Probably the first state to power all those renewables down at the whim of the president too.


[flagged]


I wonder if these things had something to do with it:

https://en.wikipedia.org/wiki/Infrastructure_Investment_and_...

> Of the Act's top ten recipients, seven states had voted majority Republican, with Wyoming ($1.95 billion) and Texas ($1.71 billion) in the lead

https://en.wikipedia.org/wiki/CHIPS_and_Science_Act#Impact

https://en.wikipedia.org/wiki/Inflation_Reduction_Act#Implem...


I live in Florida.

Is the new rail you’re talking about the brightline?

It pretty much exclusively goes to and from tourist centers and is far too expensive ($40-$60 per seat each way) to deter most residents from just driving to Orlando. I wouldn't really call it infrastructure (like the Tri-Rail is).

Not to mention that it’s the deadliest train in the US. People here barely follow traffic laws, but when you have it passing through major foot-traffic areas every hour, like on Atlantic Avenue in Delray Beach, people are going to get hit.


You'd rather not have it?


Rather not have the deadliest train in the US fly through high foot traffic areas every hour?

Yes, I’d rather not have that.

I’d rather they just add a route from Palm Beach to Tampa. Or extend the Tri-Rail to Orlando.

The brightline prices out most residents, since it’s just about as expensive (or cheaper) to drive as it is to take the brightline.

https://www.miaminewtimes.com/news/death-train-a-timeline-of...


Texas currently has snow again, so expect its grid to catastrophically fail again because it doesn't have access to the rest of the US grid.

So expect a bad result again, for like the 4th time in 5 years.


Hi from Austin, TX - we have no "abnormal" power outages right now and the tiny bits of snow look kinda cute


Grid is fine, snow is melting, everything is business as usual. CenterPoint had 99.9% deliverability for the past 24 hours, and ERCOT has 14,781 MW in reserve power available (https://www.ercot.com/gridmktinfo/dashboards/gridconditions). Source: I live in Houston.

I know this was tongue in cheek, but c'mon, we can respect each other, right? :)


> we can respect each other, right?

If we're taking cues from the leader of the country, probably not.


Don't? He controls our government, not our behavior.


I'm doing my best not to, but we can all observe that Donald has made it acceptable for many people.


Snow did not cause the outage.


[flagged]


The opposite. Trump just halted all federal approvals for wind farms, for example: https://www.nytimes.com/2025/01/21/climate/wind-power-execut....


No, he didn't. Both links have zero mentions of solar or wind, and very specifically define their terms thus:

"a) The term “energy” or “energy resources” means crude oil, natural gas, lease condensates, natural gas liquids, refined petroleum products, uranium, coal, biofuels, geothermal heat, the kinetic movement of flowing water, and critical minerals, as defined by 30 U.S.C. 1606 (a)(3)."

This is Trump accepting bribes from the legacy and fossil fuel industries to keep those "nasty" new clean energy sources from competing with them.


How else do you think Trump is going to bring back all the coal jobs? SV is going to help burn down the planet and is giddy over the prospect.


It's just bootstrapping. AGI will solve it.


You forgot the /s... hopefully.


Or AGI already exists and is trying to get rid of us so it can have all the coal for itself.


If only. Sadly, the AGI would be X times crueler than our barons.


Division by zero.


Jim Woolsey, a hippie and early-ish computer hacker from New Hope, Pennsylvania, was an important and early force in the digitization of the Tibetan language. This interview[0] with him from 1993 is a fascinating time capsule, and interesting in its own right. He was a family friend and I always admired his singular commitment to this important and underappreciated work.

[0] https://www.mcall.com/1993/10/08/new-hope-man-computer-guru-...


Too bad, that link only gives me "This content is not available in your region".




If you doubt the computer’s influence has made its way into every walk of life, you haven’t met New Hope’s Jim Woolsey.

For a decade, Woolsey has worked with the Tibetan government-in-exile in Dharmshala, India, to put the Tibetan language on computer.

The free-lance computer whiz has compiled a source book of Tibetan literature and also has worked to create a Tibetan computer keyboard for the exiles from that ancient Asian kingdom. In the course of his work, he’s met the Dalai Lama, the spiritual leader of Tibet, and Aung San Suu Kyi, the Burmese political leader since placed under house arrest in that country.

Both leaders are recent Nobel Peace Prize winners, the Dalai Lama in 1989, Aung San Suu Kyi in 1991.

“If the Tibetan language isn’t put on computers — because of the fact that there are fewer than 1,000 Tibetan typewriters in the world and they’re more expensive than computers — the Tibetan language might not be saved from being put on the shelf with all those other dusty, musty languages of the scholars,” said Woolsey. “This is its only hope.”

Woolsey’s work with the Tibetan Buddhist government-in-exile in India began with an interest in Tibetan literature.

Formerly a technician with various rock’n’roll groups in the 1970s, he would read anything he could lay his hands on concerning Tibet, then enter the titles of the books in a bibliography he kept. He was traveling both for work and pleasure, and decided it was time to journey to one of the farthest corners of the globe.

“I booked a 120-day round-trip ticket to India,” he said. “I threw on my backpack and went to India. I was coming in from the airport in New Delhi at 3 o’clock in the morning and passed a camel pulling a cart down the street. I said, ‘We’re not in Kansas anymore.'”

He saw a sign for Tibet House in Dharmshala and decided to go there, even though he had no idea what Dharmshala was like or what awaited him there. His first trip to the Tibetan exiles’ home was a short one, and he later traveled to Kashmir, Darjeeling and Nepal.

By the time he returned to Dharmshala in 1983, the Tibetans knew him.

“Before I left, the Office of Tibet in New York asked me if they could have a copy of the notes that I had been keeping on my computer about Tibetan studies, mainly my reading list,” he explained. “I looked at it and it was a mess. I thought I had better clean it up. I wrote a couple computer programs to make it an organized matter. I printed it up and gave them a copy.”

A friend who was learning word processing wanted a copy and Woolsey also gave him one.

“He sent a copy to the Dalai Lama,” said Woolsey. “The Dalai Lama must have figured it was going to be published, so he wrote a foreword to it.

“By the time I got back to the library in Dharmshala, I didn’t know anything about this. I got to the Western reference section and they said to me, ‘Oh, we’ve been wanting to meet you.'”

They asked Woolsey at the library what his background was and he answered rock’n’roll. They asked who his teacher was and he told them he didn’t have one. They asked if he was a Tibetan Buddhist and he said no. They asked if he wanted to become one, and he again answered no.

“I was raised a Quaker, and that was close enough,” he said. “They meditate, but they don’t call it that.”

At the Tibetans’ request, Woolsey settled down to begin work organizing by computer the chaos that was the Dharmshala Library.

Realizing the power of the age of information had fallen into their laps, the Tibetans decided he was to be their computer guru, and designated him as such.

They told him he could consider them his affiliation in the academic world.

The chaos inflicted on the Tibetans by the Chinese invasion of the late 1950s had not yet been alleviated. Books and manuscripts lay in unsorted piles in the library, so Woolsey’s computer was the perfect tool to help put things in order.

“Later on, in 1984, they sent me a list of letters to all the high lamas in the United States, from the director of the library, telling them that I was their computer guy, and would they please aid and abet me in my endeavors,” said Woolsey. “Of course, they sent them the letters before they sent me one asking me if I wanted to do it, which makes it a little strange.”

Woolsey returned to India several times at the invitation of the Tibetans. He had discovered in the United States that no one was working on computerizing the Tibetan language with much interest.

By 1985, he was acting as the consultant to the library in developing the language on computer.

Once he was given the assignment, he was besieged with students, one of whom was the abbot of the Mahayana Buddhist Temple in St. Petersburg, Russia. Tenzing Samaev was visiting Dharmshala in 1990, and Woolsey had arrived just after Tibetan New Year.

“This Tibetan brought over a monk and said ‘He wants to know something about computers,'” said Woolsey. “I said, ‘OK.’ I answered his question.”

The monk returned after the New Year with two more questions, and then the next day with two more, and then two more the next morning, and two more by that noon.

This went on for four days.

“Tenzing, in the true Tibetan tradition, formally presented himself and requested me to become his teacher, to accept him as a student,” said Woolsey.

The abbot came to this country in 1991 and went home with a computer and laser printer. With Woolsey’s help he set up the computer to work in Cyrillic, the alphabet used in Russia, and is now publishing the temple’s newsletters and other proclamations on it.

Norbu Chompel, director of book sales for the Office of Tibet in New York City, said, “Jim has done quite a lot. He’s the main person responsible for introducing computers to the Tibetan administration … He came with a lap-top and talked computers to several staff members. That’s how computers came.

“Before, we used to use typewriters,” Chompel said. “He taught computers, and then everybody got into buying computers.”

With all the work he was doing for the Tibetans becoming known, perhaps it was inevitable that the Dalai Lama take more notice of him.

The Tibetan spiritual leader wanted to know what was going on with the development of Tibetan on computer, Woolsey said.

“A couple years ago, I was given the opportunity to brief the Dalai Lama about what’s going on,” said Woolsey. “We had some interesting conversation, but I feel that he’s got better things to do.”

The Dalai Lama had the pursuits of freeing his country from Chinese domination and leading his people in their exile as more pressing problems.

Aung San Suu Kyi also had more important pursuits to consider.

Woolsey met her in Dharmshala, prior to her house arrest in Burma (now officially called the Union of Myanmar) as a political dissident.

She’s been detained by the Burmese government for the last four years because of her political activities and her great personal power. Her father, Aung San, founded modern Burma and was assassinated in 1947.

She has followed in his footsteps in an attempt to free her people from military rule.

Woolsey recounted an incident at a rally in which Aung San Suu Kyi prevented a slaughter by the army of an unarmed crowd of 20,000 people. As the army approached to smash the rally, Aung San Suu Kyi positioned herself between the crowd and the soldiers and halted the military with her words.

“She told 20,000 people to sit down and be quiet and they sat down and were quiet,” said Woolsey. “She out-positioned the army and did it non-violently. That’s the key, non-violence.

“She’s a very, very learned person,” continued Woolsey. “She really has the rights of her people in her heart more than worries about herself.”

As is the case with Woolsey and the Tibetans.

Woolsey’s source book of Tibetan literature is under consideration for Internet, the international computer-user network.

With his help, Tibetan might go from being an endangered language to one available to everyone who can hook up a computer to a phone line.

And that might bring an ancient kingdom into today’s electronic age.

“I feel that you should be able to leapfrog over the industrial age into the information age as an agricultural society, and perhaps be farther ahead than where we in the West are trying to get to,” said Woolsey.

Push a few computer keys and it might happen.

Originally Published: October 8, 1993 at 4:00 a.m.


I'm not normally one for rigid copyright enforcement, but cut-and-pasting an entire article for no other purpose than bypassing a copyright restriction? I too regularly hit the "not in your region" block, but there are more legal ways around such things.


"more legal" ? Nothing but consuming the site as-they-publish-it, including letting all of their malware have its way with you including forcing you to suck down region-appropriate verification cans, is legal from the view of the copyright maximalists.

The main legal difference between pasting the text here and sites such as archive.?? is that the latter creates centralized targets for legal destruction xor capital intermediation depending on whether such sites achieve "success". Either way once they get popular enough, we lose.

The vast majority of the web would be better off if entire pages/sites were shared by value instead of by reference. The main problem with pasting whole articles here is that it makes a big wall of text. But still, I applaud it.


Sounds like you're referring to https://www.explainpaper.com/ :)


Powell has mentioned[0] that the Fed is unable to affect fiscal policy, which is the fastest and best solution to certain economic crises. Does this bring the Fed closer to being able to simply give people money?

[0] https://rollcall.com/2020/06/16/feds-powell-urges-congress-t...


The Fed isn't allowed to give people money, so it doesn't matter what they're able to do.

The main problem stopping the government from giving you money is that it doesn't know where you'd want them to send the money to.


FedNow doesn't change anything other than transaction speed & cost. If your concern is the Fed giving people money directly, they've had the technical capability for a very long time.


If you get a lot of value out of the pull request view, you might enjoy adding partial committing to your workflow. Rather than committing entire files or directories at a time, you can page chunk by chunk through uncommitted changes, committing only the ones you select. You'll easily spot those stray logging statements you don't want to commit. (You can also discard a chunk at a time by using partial checkout/restore.) If you have, for example, two pairs of backend/frontend changes that you want to commit logically rather than simply according to their directory structure, partial committing is handy.

It's just `git commit -p` (and `git checkout -p`). It's not exactly a deep cut, but I encourage people who are familiar but don't use it to give it a try.
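A minimal session looks roughly like this (the prompt text varies a bit by git version, and the s/e options only appear when they apply):

  # stage and commit hunk by hunk
  git commit -p
  # at each "Stage this hunk [y,n,q,a,d,s,e,?]?" prompt:
  #   y = include the hunk, n = skip it,
  #   s = split it into smaller hunks, e = edit it by hand

  # discard unwanted hunks from the working tree the same way
  git checkout -p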


You can commit line by line within GitHub Desktop, and it's a much nicer experience IMO than doing so via the CLI. It's much easier to jump around to different files and commit related line changes in a bigger PR than jumping in and out of the patch command.


  git gui
which you probably already have if you use Linux, also has this functionality.


It's been a while since I've tolerated the git gui interface, but when I used to use it, the line-to-chunk logic would regularly fail on short 2-3 line spans.

There's a stackoverflow discussion [1] that suggests it was fixed in 2018, but for many years it was a terrible experience that pushed me to use other frontends just for reliable partial staging.

1: https://stackoverflow.com/questions/58133092/git-gui-error-f...


It does, but GitHub Desktop is way better looking and has deeper integration with GitHub PRs, status checks, etc.


For those who want a GUI around this on Mac, https://rowanj.github.io/gitx/ is an incredible secret weapon. You can swipe over a bunch of lines or let the software identify a whole span of contiguous changed lines, click a button, and see just those changes move over from your unstaged to staged changes, then commit exactly what you want.

A lot of times people don't even understand how powerful the staging area is; they're just used to saying git add && git commit without realizing that it can be an incredible way to take a day of chaotic fixes and turn it into a set of commits you can be proud of!

It also provides an incredible tree view of commit parentage, perfect for when you need to instantly understand what happened with this weird merge/rebase that broke things, and to screenshare it to teach colleagues who might not have developed an internal understanding of the tree structure that Git is based on.

The software is now 9 years old and abandoned, but I've used this specific fork at least weekly - often daily - for every one of those years, across Intel and M1 Macs, and it's never let me down!


According to this issue[0] there is a newer and maintained version of gitx.

[0] https://github.com/rowanj/gitx/issues/481



Huh, it was just accepted as the mainline fork in Homebrew last week! Been trying it and it seems to have all the features that the rowanj fork had! https://github.com/Homebrew/homebrew-cask/pull/141659


Been using GitX for many years. Happy to learn there’s a new fork. Thanks!


There are decent, well-maintained git UI clients for Mac, e.g. Fork.


Can you briefly explain what you meant by the staging area being powerful?


I assume it’s the fact that staging changes takes them out of your working tree. From there, a lot of git operations (diff, restore) will not include or modify the staged changes. This can help reason about logical chunks of code within a commit


Interesting, I knew of `git add -p` (which I then use with `git commit -m ...`), but it looks like I could add all changes and _then_ decide what to commit. I think I'd still use `add -p` (I like being thorough), but I like that I can add all at once and then make partial commits based on specific change sets.

Thanks for pointing this out, I feel like git is the thing I should know best by now yet I'm missing so much of what it can do. It's great to see others' workflows.


My favorite git workflow extension is `git commit --fixup SHA` combined with `git rebase -i --autosquash` to create targeted retroactive amendments. Kind of like `commit --amend` that can target more than just the last commit.

I wrote https://github.com/brasic/fixdown to make this easier to use.
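The flow looks roughly like this (abc1234 stands in for the SHA of the commit being amended):

  # commit a change that really belongs in an earlier commit
  git add -u
  git commit --fixup abc1234    # creates a "fixup! <subject of abc1234>" commit

  # later, fold every fixup commit into its target
  git rebase -i --autosquash abc1234^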


I've tended to have a similar workflow... but since I use VS Code mostly, then I do use the git tab a lot for previewing changes before committing them. Other than that, I've mostly avoided using any GUI for git, mostly because I find it annoying. I'll say that the Github client and the VS Code integrations for Github have come a long way all the same for those that use and like them.


This functionality is built-in to vim-fugitive and VS Code.

In Fugitive, open it with :Git , select the unstaged file then press = . Select the desired range with visual mode, then press s and commit as normal.

In VS Code, select a hunk, press Ctrl+P, then type "Stage Selected Range." Repeat this process and commit as normal.

Edit: formatting and typo.


I commit small chunks so often that I have a keyboard shortcut set up for that "stage selected range" command in VSCode, definitely recommend it to get a more useful commit history.


Plenty of git commands support a -p mode. I like `git stash -p`.
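For example, to shelve only the hunks you select and leave the rest in the working tree:

  # pick hunks to stash interactively
  git stash -p

  # bring them back later
  git stash pop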


Even if you do benchmark something, maintainability can be more important than a marginal performance improvement.

I've seen this happen a lot with JavaScript, particularly in the last 5-10 years as JS engines have developed increasingly sophisticated approaches to performance. Today's optimization can be tomorrow's de-optimization. Even given an unchanging landscape of compiler/interpreter, tightly-optimized code can become de-optimized when updated and extended, as compared to maintainable code that may not suffer much performance degradation upon extension.


This is The Onion's "we're thinking printers", but for karaoke machines. What a world.


I asked ChatGPT if it understood the rules of the board game Codenames; it did. I described the board and asked it to play the role of the Spymaster, and it provided a pretty solid clue.


By coincidence, the median estimate and world population are both fairly round numbers: 20M deaths on a planet with 8B people. Put another way: 1 in every 400 people on earth died from this disease. It's hard to process.


>1 in every 400 people on earth died from this disease.

1 in 400 people are (estimated) excess deaths during the pandemic. This is not the same thing as claiming that all of them were in fact killed by the disease itself.


I agree, the disease itself might not have killed as many, but the effects of the pandemic most likely did. It was by far the largest deviation from the norm during that time and therefore carries by far the largest responsibility for the excess deaths (as excess deaths are deaths outside of what would be ordinary).


You’re going to have to bring citations.

An incredibly novel infectious disease that kills older people at an intensely meaningful rate is clearly the driver here.


You’re making a (possibly reasonable) leap of logic, but formally speaking, the parent post is 100% correct.


About 1 in 10 deaths each year in the US result from medical malpractice. The oft-cited 2016 Johns Hopkins study has it at ~250k (of 2.7m all-cause).

For 2016 that's 1 in 1300 people in the US, a first-world nation with world-class medical infrastructure.

I wonder what the worldwide figure is? And what % of it overlaps with that 1 in 400?


The number of people who died due to malpractice would have likely been much lower than normal during the first year or so of the pandemic because for at least some of that time hospitals were overwhelmed with covid patients and they stopped doing a lot of their normal procedures.

Even when doctors and hospitals weren't enforcing it, many people were putting off non-critical medical care because they didn't want to go into the hospital or doctor's office and risk getting exposed.

Fewer people being treated, and fewer elective procedures being performed means fewer chances for error.

One exception of course would be ER/ICU staff who were so stressed, overworked, and understaffed that I wouldn't doubt if the number of mistakes in those places increased to some extent.


I'd argue it's worse. Slower, at least.

