
Grid Storage Systems for Renewable Energy - Technology and Projects (No Politics)

Status
Not open for further replies.

blueone

Major Contributor
Forum Donor
Joined
May 11, 2019
Messages
1,195
Likes
1,547
Location
USA
Yeah, coal may be worse; burning garbage in one’s front yard is even worse. But the facts about peaker plants stand.
No they don't. Peaker plants almost exclusively use gas turbine engines, which are cleaner than any other fossil-fueled power source. No, they're not as clean as solar and wind, but they emit 40-50% less CO2 than coal, and no mercury, sulfur, or particulates. And they can produce power indefinitely, unlike any proposed battery peak-handling alternatives. No other alternative can be turned on and off in a short time, and no other practical alternative can run 24/7 as needed. And they're hydrogen-ready.
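For rough scale on that 40-50% figure, here is a back-of-envelope comparison (a minimal sketch; the emission factors are typical published ballpark values per MWh, not numbers for any specific plant):

Code:
# Back-of-envelope CO2 comparison: coal vs. gas turbine plants.
# Emission factors are rough, typical values (kg CO2 per MWh generated);
# real plants vary with efficiency, fuel quality, and operating point.
EMISSION_FACTORS = {
    "coal_steam": 1000,         # kg CO2 / MWh, typical subcritical coal unit
    "gas_simple_cycle": 550,    # kg CO2 / MWh, simple-cycle gas turbine peaker
    "gas_combined_cycle": 400,  # kg CO2 / MWh, combined-cycle plant
}

def reduction_vs_coal(tech: str) -> float:
    """Percent CO2 reduction of `tech` relative to the coal baseline."""
    coal = EMISSION_FACTORS["coal_steam"]
    return 100.0 * (coal - EMISSION_FACTORS[tech]) / coal

for tech in ("gas_simple_cycle", "gas_combined_cycle"):
    print(f"{tech}: ~{reduction_vs_coal(tech):.0f}% less CO2 than coal")

# With these assumed factors, a simple-cycle peaker lands at ~45% less CO2
# than coal and a combined-cycle plant at ~60%, consistent with the 40-50%
# range cited above for peakers.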
 

Doodski

Grand Contributor
Forum Donor
Joined
Dec 9, 2019
Messages
21,631
Likes
21,906
Location
Canada
Looks like a long slog for Hydrogen, but the EU is betting on it.
Same here in Alberta. "Air Products' Alberta Net-Zero Hydrogen Complex: This $1.3 billion auto-thermal hydrogen production facility based in Edmonton is expected to be on-stream in 2024 producing a natural-gas based hydrogen."
 
OP
MediumRare

Major Contributor
Forum Donor
Joined
Sep 17, 2019
Messages
1,956
Likes
2,283
Location
Chicago
No they don't. Peaker plants almost exclusively use gas turbine engines, which are cleaner than any other fossil-fueled power source. No, they're not as clean as solar and wind, but they emit 40-50% less CO2 than coal, and no mercury, sulfur, or particulates. And they can produce power indefinitely, unlike any proposed battery peak-handling alternatives. No other alternative can be turned on and off in a short time, and no other practical alternative can run 24/7 as needed. And they're hydrogen-ready.
 

Travis

Senior Member
Joined
Jul 9, 2018
Messages
455
Likes
552
SEMATECH facilitated industry standards. That's about it.
You are way off base here, not even close to reality. SEMATECH was a client, a former CEO was a client, and I have had numerous tours of that Austin facility. Anyone who was around chip manufacturing in that timeframe, late 80s to early 90s, wouldn't agree with that at all. You may have looked up something more recent, but it's apples and oranges; it has all been moved out to Albany. They built a massive plant with a large-scale clean room in Austin, and they had 400 employees, about half of whom were sent from the consortium members for two years. Those were engineers who were charged with catching up with the capabilities of the Japanese manufacturers by increasing wafer size and speeding up the miniaturization process. At the end of the five-year funding period, they had increased the wafer size and reduced the miniaturization cycle from 3 years to 2 years.

Search SEMATECH and patents; there are probably more than 100, all having to do with chip manufacturing. I don't even think they developed any standards in the technical sense, but who knows. They were all about making bigger wafers, and thinner ones (which doubles memory).

"Establishment of SENATECH Legislation to form a Semiconductor Manufacturing Technology (SEMATECH) facility was approved as part of Public Law 100-180 in the 1988 Defense Appropriations Bill. SEMATECH was incorporated on August 7, 1987 as a Department of Defense (DOD)/Industry partnership. Its focus was to develop world-leading semiconductor manufacturing within the United States. The initial organization consisted of 14 semiconductor manufacturing companies"

"Members of the Consortium recognized the United States' weakness was in manufacturing technology, not in basic semiconductor design or research. Therefore, they established an overall goal to match Japan's manufacturing capabilities by 1993." [The Consortium included, Intel, TI, Motorola, National Semiconductor]

"Technology Development: SEMATECH has established and demonstrated baseline processes that have increased chip production by four fold on single silicon wafers. This was achieved solely through the use of American manufacturing equipment. Simultaneously, SEMATECH has achieved greater conductor density. The initial baseline was established using 0.8 micron and 0.65 micron geometries on 100 mm wafers. Transitioning through 0.50 microns on 150 mm wafers, they are now able to manufacture chips with 0.35 micron conductors on 200 mm wafers.29 This represents parity with Japan's best semiconductor technology. In 1993, U.S. manufacturers are expected to surpass Japan through the introduction of a 0.25 micron manufacturing capability that has already been demonstrated by SEMATECH in low production runs. Further developments in the next few years are expected to reduce conductor size to 0.18 microns."

The subsidies continue, as you mentioned, now CHIPS (which is modeled off of SEMATECH).

It just seems as though you are grasping to support an assertion (subsidies are bad, the internet wasn't subsidized) when the reality is that all sorts of industries in the US are subsidized: initially, midstream, or sometimes continuously, either directly with cash or tax/investment credits, or indirectly through general tax provisions.
 

levimax

Major Contributor
Joined
Dec 28, 2018
Messages
2,399
Likes
3,528
Location
San Diego
Looks like a long slog for Hydrogen, but the EU is betting on it.
Hydrogen certainly seems like the "dark horse" in this race, but Toyota has stubbornly continued to work on both direct-combustion hydrogen engines and hydrogen fuel cell vehicles. They see the direct-combustion engines as a bridge technology to either full electric or hydrogen fuel cell electric cars. As much as it makes sense "on paper" to just go all battery electric vehicles, it really isn't practical with existing infrastructure, and it is going to take some time and a mix of "transition" technologies, with the final outcome being "electric cars" (80%+ electric motor efficiency can't be beat). But the fuel cell vs. battery vs. capacitor vs. some other unknown technology race is going to be interesting to watch. I hope wise policies are put in place to allow for competing technologies so that in the end the consumer and the environment end up the winners.
 

Timcognito

Major Contributor
Forum Donor
Joined
Jun 28, 2021
Messages
3,566
Likes
13,367
Location
NorCal
Hydrogen certainly seems like the "dark horse" in this race, but Toyota has stubbornly continued to work on both direct-combustion hydrogen engines and hydrogen fuel cell vehicles. They see the direct-combustion engines as a bridge technology to either full electric or hydrogen fuel cell electric cars. As much as it makes sense "on paper" to just go all battery electric vehicles, it really isn't practical with existing infrastructure, and it is going to take some time and a mix of "transition" technologies, with the final outcome being "electric cars" (80%+ electric motor efficiency can't be beat). But the fuel cell vs. battery vs. capacitor vs. some other unknown technology race is going to be interesting to watch. I hope wise policies are put in place to allow for competing technologies so that in the end the consumer and the environment end up the winners.
Agree. It also seems like less-distributed uses of hydrogen, such as solar paired with nighttime power plants, large aircraft, and rail transport, might be in the cards.
 
OP
MediumRare

MediumRare

Major Contributor
Forum Donor
Joined
Sep 17, 2019
Messages
1,956
Likes
2,283
Location
Chicago
Hydrogen certainly seems like the "dark horse" in this race, but Toyota has stubbornly continued to work on both direct-combustion hydrogen engines and hydrogen fuel cell vehicles. They see the direct-combustion engines as a bridge technology to either full electric or hydrogen fuel cell electric cars. As much as it makes sense "on paper" to just go all battery electric vehicles, it really isn't practical with existing infrastructure, and it is going to take some time and a mix of "transition" technologies, with the final outcome being "electric cars" (80%+ electric motor efficiency can't be beat). But the fuel cell vs. battery vs. capacitor vs. some other unknown technology race is going to be interesting to watch. I hope wise policies are put in place to allow for competing technologies so that in the end the consumer and the environment end up the winners.
Sadly, hydrogen as a fuel is mostly a sick joke. First, production requires vast amounts of excess electricity; second, liquefying hydrogen (cooling it to about -423 °F) wastes roughly 30% of that energy; third, the distribution infrastructure doesn't exist; fourth, batteries (as they evolve) are entirely sufficient and economical for light vehicles. Green ammonia and other synthetic fuels will cover heavy applications. Liquid H2 is a true industry-led government boondoggle, not solar & wind.
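To put rough numbers on that chain of losses, here is a minimal power-to-power sketch (every stage efficiency below is an assumed, commonly cited ballpark, not a measurement of any particular system):

Code:
# Rough power-to-X-to-power round trips; each figure is an assumed, typical
# stage efficiency, not a measurement of any particular system.
from functools import reduce

CHAINS = {
    "liquid H2 (electrolysis -> liquefaction -> fuel cell)": [0.70, 0.70, 0.55],
    "compressed H2 (electrolysis -> compression -> fuel cell)": [0.70, 0.90, 0.55],
    "lithium-ion battery (charge -> discharge)": [0.95, 0.95],
}

for name, stages in CHAINS.items():
    round_trip = reduce(lambda a, b: a * b, stages)
    print(f"{name}: ~{round_trip * 100:.0f}% of the input electricity returned")

# With these assumptions the liquid-H2 path returns roughly a quarter of the
# input energy, versus ~90% for a battery -- the argument above in numbers.
# Liquefaction alone (taken as 70% efficient here) accounts for the ~30% loss
# mentioned in the post.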
 

blueone

Major Contributor
Forum Donor
Joined
May 11, 2019
Messages
1,195
Likes
1,547
Location
USA
I can't believe you fall for this baloney. Modern gas peaker plants using turbines do not emit particulates. Yet what does the reference focus on? PM2.5.
 

Timcognito

Major Contributor
Forum Donor
Joined
Jun 28, 2021
Messages
3,566
Likes
13,367
Location
NorCal
Liquid H2 is a true industry-led government boondoggle, not solar & wind.
Yes, but there is excess solar during the day, and if a city needs the power only at night, H2 may be the answer. It's going to take a lot of batteries to power Chicago. Transportation is a big bugaboo for H2.
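For a sense of how many batteries "a lot" is, a back-of-envelope sketch (all inputs are assumed round numbers, not utility data):

Code:
# Order-of-magnitude estimate of battery storage needed to carry a large city
# overnight. All inputs are assumed round numbers, not utility data.
AVG_DEMAND_GW = 10.0    # assumed average overnight load for the Chicago area
OVERNIGHT_HOURS = 12.0  # hours to bridge from sunset to usable solar
PACK_MWH = 3.9          # nominal energy of one utility-scale container
                        # (Megapack-class unit, assumed figure)

energy_needed_gwh = AVG_DEMAND_GW * OVERNIGHT_HOURS
packs = energy_needed_gwh * 1000 / PACK_MWH
print(f"Energy to bridge one night: ~{energy_needed_gwh:.0f} GWh")
print(f"Utility-scale packs required: ~{packs:,.0f}")

# ~120 GWh, on the order of 30,000 container-sized packs for a single night,
# which is why multi-day or seasonal storage discussions reach for hydrogen,
# pumped hydro, or other bulk options rather than batteries alone.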
 

Travis

Senior Member
Joined
Jul 9, 2018
Messages
455
Likes
552
I can't find a single reference to support your claim of government funding for what the long-distance carriers did
Because you have expectation bias: you assume it doesn't exist because it doesn't fit your premise, so you probably didn't really look.

This took 2 seconds to find using "internet", "subsidies", and "history".


A quote from that article (the emphasis in bold is mine, as well as the comments in brackets):

NSFNET went online in 1986 and connected the supercomputer centers at 56,000 bits per second—the speed of a typical computer modem today. In a short time, the network became congested and, by 1988, its links were upgraded to 1.5 megabits per second. A variety of regional research and education networks, supported in part by NSF, were connected to the NSFNET backbone, thus extending the Internet’s reach throughout the United States.

Creation of NSFNET was an intellectual leap. It was the first large-scale implementation of Internet technologies in a complex environment of many independently operated networks. NSFNET forced the Internet community to iron out technical issues arising from the rapidly increasing number of computers and address many practical details of operations, management and conformance.

Throughout its existence, NSFNET carried, at no cost to institutions, any U.S. research and education traffic that could reach it. At the same time, the number of Internet-connected computers grew from 2,000 in 1985 to more than 2 million in 1993. To handle the increasing data traffic, the NSFNET backbone became the first national 45-megabits-per-second Internet network in 1991.

The history of NSFNET and NSF's supercomputing centers also overlapped with the rise of personal computers and the launch of the World Wide Web in 1991 by Tim Berners-Lee and colleagues at CERN, the European Organisation for Nuclear Research, in Geneva, Switzerland. The NSF centers developed many tools for organizing, locating and navigating through information, including one of the first widely used Web server applications. But perhaps the most spectacular success was Mosaic, the first freely available Web browser to allow Web pages to include both graphics and text, which was developed in 1993 by students and staff working at the NSF-supported National Center for Supercomputing Applications (NCSA) at the University of Illinois, Urbana-Champaign. In less than 18 months, NCSA Mosaic became the Web "browser of choice" for more than a million users and set off an exponential growth in the number of Web servers as well as Web surfers. Mosaic was the progenitor of modern browsers such as Microsoft Internet Explorer and Netscape Navigator.

Privatization: 1993-1998.

Commercial firms noted the popularity and effectiveness of the growing Internet and built their own networks. The proliferation of private suppliers led to an NSF solicitation in 1993 that outlined a new Internet architecture that largely remains in place today.

From that solicitation, NSF awarded contracts in 1995 for three network access points, to provide connection points between commercial networks, and one routing arbiter, to ensure an orderly exchange of traffic across the Internet. In addition, NSF signed a cooperative agreement to establish the next-generation very-high-performance Backbone Network Service. A more prominent milestone was the decommissioning of the NSFNET backbone in April 1995.

[If you wish to dig deeper, you will find these contracts went to IBM, MCI, and a couple of others. IBM eventually sold their internet infrastructure to AT&T, which had finally seen the light.]

In the years following NSFNET, NSF helped navigate the road to a self-governing and commercially viable Internet during a period of remarkable growth. The most visible, and most contentious, component of the Internet transition was the registration of domain names. Domain name registration associates a human-readable character string (such as “nsf.gov”) with Internet Protocol (IP) addresses, which computers use to locate one another.

The Department of Defense funded early registration efforts because most registrants were military users and awardees. By the early 1990s, academic institutions comprised the majority of new registrations, so the Federal Networking Council (a group of government agencies involved in networking) asked NSF to assume responsibility for non-military Internet registration. When NSF awarded a five-year agreement for this service to Network Solutions, Inc. (NSI), in 1993, there were 7,500 domain names.

In September 1995, as the demand for Internet registration became largely commercial (97 percent) and grew by orders of magnitude, the NSF authorized NSI to charge a fee for domain name registration. Previously, NSF had subsidized the cost of registering all domain names. At that time, there were 120,000 registered domain names. In September 1998, when NSF’s agreement with NSI expired, the number of registered domain names had passed 2 million.

The year 1998 marked the end of NSF's direct role in the Internet. That year, the network access points and routing arbiter functions were transferred to the commercial sector. And after much debate, the Department of Commerce's National Telecommunications and Information Administration formalized an agreement with the non-profit Internet Corporation for Assigned Names and Numbers (ICANN) for oversight of domain name registration. Today, anyone can register a domain name through a number of ICANN-accredited registrars.

NSF after NSFNET.

The decommissioning of NSFNET and privatization of the Internet did not mark the end of NSF’s involvement in networking. NSF continues to support many research projects to develop new networking tools, educational uses of the Internet and network-based applications.

[End of quote]

Generally, this is how every utility in the US was started, or modernized: power, telephone, hydroelectric infrastructure, even the transcontinental railroad. You can research any specific aspect of the transition you wish. You will find that Vinton Cerf was at the center of the transition, including working at MCI on their email system.

I had forgotten that the entire thing was government funded, top to bottom, including the software. It was built and in place (the "backbone") when it was turned over. Did commercial ISPs expand it? Of course they did.

So was the subsidization to create the internet a good thing or a bad thing? I don't know; all things being equal, it seems like a success. A science-based agency managed its development so that all would be able to connect and communicate, jump-started it with the installation of a turnkey nationwide system, and turned it over to private enterprise to exploit it (in the good sense). Not only was it subsidized, it was entirely built with taxpayer money. After privatization, it was expanded. Now we are back to "Municipal Broadband" models where entire cities have internet access paid for 100% with taxpayer money.
 

levimax

Major Contributor
Joined
Dec 28, 2018
Messages
2,399
Likes
3,528
Location
San Diego
Sadly, hydrogen as a fuel is mostly a sick joke. First, production requires vast amounts of excess electricity; second, liquefying hydrogen (cooling it to about -423 °F) wastes roughly 30% of that energy; third, the distribution infrastructure doesn't exist; fourth, batteries (as they evolve) are entirely sufficient and economical for light vehicles. Green ammonia and other synthetic fuels will cover heavy applications. Liquid H2 is a true industry-led government boondoggle, not solar & wind.
If what you say is true and hydrogen is allowed to compete fairly with the alternatives, it will fail, Toyota and other investors in the technology will lose, and the consumer and environment will win. I agree governments should not back hydrogen, but where we probably disagree is that I don't think they should back anything. Set some wise standards / emission goals (this is the hard part, obviously) and let the competition sort out the winners and losers.
 

blueone

Major Contributor
Forum Donor
Joined
May 11, 2019
Messages
1,195
Likes
1,547
Location
USA
You are way off base here, not even close to reality. SEMATECH was a client, a former CEO was a client, and I have had numerous tours of that Austin facility. Anyone who was around chip manufacturing in that timeframe, late 80s to early 90s, wouldn't agree with that at all. You may have looked up something more recent, but it's apples and oranges; it has all been moved out to Albany. They built a massive plant with a large-scale clean room in Austin, and they had 400 employees, about half of whom were sent from the consortium members for two years. Those were engineers who were charged with catching up with the capabilities of the Japanese manufacturers by increasing wafer size and speeding up the miniaturization process. At the end of the five-year funding period, they had increased the wafer size and reduced the miniaturization cycle from 3 years to 2 years.

Search SEMATECH and patents; there are probably more than 100, all having to do with chip manufacturing. I don't even think they developed any standards in the technical sense, but who knows. They were all about making bigger wafers, and thinner ones (which doubles memory).

"Establishment of SENATECH Legislation to form a Semiconductor Manufacturing Technology (SEMATECH) facility was approved as part of Public Law 100-180 in the 1988 Defense Appropriations Bill. SEMATECH was incorporated on August 7, 1987 as a Department of Defense (DOD)/Industry partnership. Its focus was to develop world-leading semiconductor manufacturing within the United States. The initial organization consisted of 14 semiconductor manufacturing companies"

"Members of the Consortium recognized the United States' weakness was in manufacturing technology, not in basic semiconductor design or research. Therefore, they established an overall goal to match Japan's manufacturing capabilities by 1993." [The Consortium included, Intel, TI, Motorola, National Semiconductor]

"Technology Development: SEMATECH has established and demonstrated baseline processes that have increased chip production by four fold on single silicon wafers. This was achieved solely through the use of American manufacturing equipment. Simultaneously, SEMATECH has achieved greater conductor density. The initial baseline was established using 0.8 micron and 0.65 micron geometries on 100 mm wafers. Transitioning through 0.50 microns on 150 mm wafers, they are now able to manufacture chips with 0.35 micron conductors on 200 mm wafers.29 This represents parity with Japan's best semiconductor technology. In 1993, U.S. manufacturers are expected to surpass Japan through the introduction of a 0.25 micron manufacturing capability that has already been demonstrated by SEMATECH in low production runs. Further developments in the next few years are expected to reduce conductor size to 0.18 microns."

The subsidies continue, as you mentioned, now CHIPS (which is modeled off of SEMATECH).

It just seems as though you are grasping to support an assertion (subsidies are bad, the internet wasn't subsidized) when the reality is that all sorts of industries in the US are subsidized: initially, midstream, or sometimes continuously, either directly with cash or tax/investment credits, or indirectly through general tax provisions.
You're quoting SEMATECH's marketing material.

A client? What is your expertise? Have you ever worked in the semiconductor industry? SEMATECH's output was specifications, patents, and research papers. And to get preferential licensing treatment of SEMATECH patents you had to be a member of the consortium, just like with other industry groups. No one could afford to be left out.
 
OP
MediumRare

Major Contributor
Forum Donor
Joined
Sep 17, 2019
Messages
1,956
Likes
2,283
Location
Chicago
Yes, but there is excess solar during the day, and if a city needs the power only at night, H2 may be the answer. It's going to take a lot of batteries to power Chicago. Transportation is a big bugaboo for H2.
Agreed re: excess solar and wind; that's why this thread is about grid storage in all its forms.
 

blueone

Major Contributor
Forum Donor
Joined
May 11, 2019
Messages
1,195
Likes
1,547
Location
USA
Because you have expectation bias: you assume it doesn't exist because it doesn't fit your premise, so you probably didn't really look.

This took 2 seconds to find using "internet", "subsidies", and "history".


A quote from that article (the emphasis in bold is mine, as well as the comments in brackets):

NSFNET went online in 1986 and connected the supercomputer centers at 56,000 bits per second—the speed of a typical computer modem today. In a short time, the network became congested and, by 1988, its links were upgraded to 1.5 megabits per second. A variety of regional research and education networks, supported in part by NSF, were connected to the NSFNET backbone, thus extending the Internet’s reach throughout the United States.

Creation of NSFNET was an intellectual leap. It was the first large-scale implementation of Internet technologies in a complex environment of many independently operated networks. NSFNET forced the Internet community to iron out technical issues arising from the rapidly increasing number of computers and address many practical details of operations, management and conformance.

Throughout its existence, NSFNET carried, at no cost to institutions, any U.S. research and education traffic that could reach it. At the same time, the number of Internet-connected computers grew from 2,000 in 1985 to more than 2 million in 1993. To handle the increasing data traffic, the NSFNET backbone became the first national 45-megabits-per-second Internet network in 1991.

The history of NSFNET and NSF's supercomputing centers also overlapped with the rise of personal computers and the launch of the World Wide Web in 1991 by Tim Berners-Lee and colleagues at CERN, the European Organisation for Nuclear Research, in Geneva, Switzerland. The NSF centers developed many tools for organizing, locating and navigating through information, including one of the first widely used Web server applications. But perhaps the most spectacular success was Mosaic, the first freely available Web browser to allow Web pages to include both graphics and text, which was developed in 1993 by students and staff working at the NSF-supported National Center for Supercomputing Applications (NCSA) at the University of Illinois, Urbana-Champaign. In less than 18 months, NCSA Mosaic became the Web "browser of choice" for more than a million users and set off an exponential growth in the number of Web servers as well as Web surfers. Mosaic was the progenitor of modern browsers such as Microsoft Internet Explorer and Netscape Navigator.

Privatization: 1993-1998. Commercial firms noted the popularity and effectiveness of the growing Internet and built their own networks. The proliferation of private suppliers led to an NSF solicitation in 1993 that outlined a new Internet architecture that largely remains in place today.

From that solicitation, NSF awarded contracts in 1995 for three network access points, to provide connection points between commercial networks, and one routing arbiter, to ensure an orderly exchange of traffic across the Internet. In addition, NSF signed a cooperative agreement to establish the next-generation very-high-performance Backbone Network Service. A more prominent milestone was the decommissioning of the NSFNET backbone in April 1995.

[If you wish to dig deeper, you will find these contracts went to IBM, MCI, and a couple of others. IBM eventually sold their internet infrastructure to AT&T, which had finally seen the light.]

In the years following NSFNET, NSF helped navigate the road to a self-governing and commercially viable Internet during a period of remarkable growth. The most visible, and most contentious, component of the Internet transition was the registration of domain names. Domain name registration associates a human-readable character string (such as “nsf.gov”) with Internet Protocol (IP) addresses, which computers use to locate one another.

The Department of Defense funded early registration efforts because most registrants were military users and awardees. By the early 1990s, academic institutions comprised the majority of new registrations, so the Federal Networking Council (a group of government agencies involved in networking) asked NSF to assume responsibility for non-military Internet registration. When NSF awarded a five-year agreement for this service to Network Solutions, Inc. (NSI), in 1993, there were 7,500 domain names.

In September 1995, as the demand for Internet registration became largely commercial (97 percent) and grew by orders of magnitude, the NSF authorized NSI to charge a fee for domain name registration. Previously, NSF had subsidized the cost of registering all domain names. At that time, there were 120,000 registered domain names. In September 1998, when NSF’s agreement with NSI expired, the number of registered domain names had passed 2 million.

The year 1998 marked the end of NSF's direct role in the Internet. That year, the network access points and routing arbiter functions were transferred to the commercial sector. And after much debate, the Department of Commerce's National Telecommunications and Information Administration formalized an agreement with the non-profit Internet Corporation for Assigned Names and Numbers (ICANN) for oversight of domain name registration. Today, anyone can register a domain name through a number of ICANN-accredited registrars.

NSF after NSFNET. The decommissioning of NSFNET and privatization of the Internet did not mark the end of NSF’s involvement in networking. NSF continues to support many research projects to develop new networking tools, educational uses of the Internet and network-based applications.

[End of quote]

Generally, this is how every utility in the US was started, or modernized: power, telephone, hydroelectric infrastructure, even the transcontinental railroad. You can research any specific aspect of the transition you wish. You will find that Vinton Cerf was at the center of the transition, including working at MCI on their email system.
You're confusing subsidizing construction of the US internet (my point being that it didn't) with the NSF's effort to standardize networking so that the DoD and its agencies did not have to try to inter-network a bunch of proprietary networking architectures from IBM and the BUNCH computer companies (pun intended). I'm not finding it fun or educational for others to continue discussing it with you, especially when you have a habit of starting out with an insult.
 

Travis

Senior Member
Joined
Jul 9, 2018
Messages
455
Likes
552
You're quoting SEMATECH's marketing material.

A client? What is your expertise? Have you ever worked in the semiconductor industry? SEMATECH's output was specifications, patents, and research papers. And to get preferential licensing treatment of SEMATECH patents you had to be a member of the consortium, just like with other industry groups. No one could afford to be left out.
It's not their marketing materials. It's an annual report to Congress by the GAO; one was filed every year there was funding, and that's how they track things like that. Their output was dramm memory and wafers. The production facility ran 24/7/365. They also tested and improved production equipment.

I handled a wide variety of legal matters for SEMATECH, their CEO at the time, and others. You literally have no idea what you are talking about, which is becoming more and more clear with each post you make.
 

Travis

Senior Member
Joined
Jul 9, 2018
Messages
455
Likes
552
You're confusing subsidizing construction of the US internet (my point being that it didn't) with the NSF's effort to standardize networking so that the DoD and its agencies did not have to try to inter-network a bunch of proprietary networking architectures from IBM and the BUNCH computer companies (pun intended). I'm not finding it fun or educational for others to continue discussing it with you, especially when you have a habit of starting out with an insult.
It wasn't meant as an insult, just a logical deduction from the fact that you couldn't find anything: either you never looked, you looked and ignored what you found, or you didn't look very hard due to expectation bias. The last was giving you the benefit of the doubt, not an insult; we all have bias.

You are not finding it fun or educational because your premise is flawed and the facts on how the internet was created in the US don't match what you have represented - that's never fun. It may not be educational to you, but it is educational to others as to how much fact-checking they need to do with you (or me, or anyone else), which is really the only purpose at this point.
 

blueone

Major Contributor
Forum Donor
Joined
May 11, 2019
Messages
1,195
Likes
1,547
Location
USA
It's not their marketing materials. It's an annual report to Congress by the GAO; one was filed every year there was funding, and that's how they track things like that. Their output was dramm memory and wafers. The production facility ran 24/7/365. They also tested and improved production equipment.
I'm sure SEMATECH always had a lab facility for research purposes. After 2010, in their reconstitution in Albany, I understand they have access to more extensive research facilities, but I've read very little about their work. Still, the most they do is produce test chips, like IBM does. Are you talking about pre- or post-2010 in your association with them, or both?

BTW, there's no such thing as "dramm memory"; the acronym is DRAM, which stands for Dynamic Random Access Memory. It's called dynamic because each cell stores its bit as charge on a tiny capacitor that leaks, so the data needs periodic refreshing to be retained (on the order of every 64 ms); it's not dynamic merely because it's volatile.
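For a concrete feel of that refresh requirement, a minimal sketch (the 64 ms window is the typical DDR-era figure; the row count is an assumed, illustrative value, since real parts vary by density):

Code:
# Illustrative DRAM refresh budget: every row must be refreshed once per
# retention window, so the controller issues refreshes at a steady cadence.
RETENTION_MS = 64.0   # typical retention/refresh window in DDR-era specs
ROWS_PER_BANK = 8192  # assumed row count; real parts vary by density

refresh_interval_us = RETENTION_MS * 1000 / ROWS_PER_BANK
print(f"One row refresh every ~{refresh_interval_us:.2f} us "
      f"to cover {ROWS_PER_BANK} rows within {RETENTION_MS:.0f} ms")

# ~7.8 us between refresh commands, the familiar tREFI figure. SRAM, by
# contrast, holds state in cross-coupled gates and needs no refresh, which is
# why "dynamic" refers to the leaky capacitor cell rather than to volatility.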

I handled a wide variety of legal matters for SEMATECH, their CEO at the time, and others. You literally have no idea what you are talking about, which is becoming more and more clear with each post you make.
Ah, a non-technical attorney. I obviously don't know what I'm talking about because I don't agree with you. How do you know I have no idea what I'm talking about if you don't know how the internet works, or how its technology and implementation evolved?
 

j_j

Major Contributor
Audio Luminary
Technical Expert
Joined
Oct 10, 2017
Messages
2,282
Likes
4,792
Location
My kitchen or my listening room.
Fires, floods, droughts, storms and surges, and groundwater depletion are all on the rise, and the highest temperatures on record were in the last couple of days. Sure, medicine, sanitation, and fertilizer have kept people healthy and living longer.
Of course, the point that humanity is supposed to be smart enough to anticipate catastrophe really does suggest that we should be paying attention now; hence the concern.
 