
Master AI (Artificial Intelligence) Discussion/News Thread

Just like after the dot-com era, AI will become the norm, like cell phones and internet interaction, with many white-collar jobs disappearing to automation and most small VC-funded companies ending up as write-offs. As computing power increases, along with the distillation of existing data and emerging proprietary databases, AI will become essential, with large interdependent datasets crunched in real time for immediate use in applications and automation. Its general use by the public will go the way of satellite navigation, online purchasing, and screen-based entertainment: ubiquitous and taken for granted, neither magic nor scourge. The future of data centers without proprietary data, applications, or technology is unclear.

New chipsets will be developed that require 1/10th the energy to operate advanced AI systems and special CPU cooling will no longer be required. Existing tech will be retrofitted until its end of life. The physical space required for AI support will shrink as efficiency expands. Electric usage by these centers will drop exponentially. Electric companies will have excess electricity capacity that becomes idle. :cool:
 
New chipsets will be developed that require 1/10th the energy to operate advanced AI systems and special CPU cooling will no longer be required. Existing tech will be retrofitted until its end of life. The physical space required for AI support will shrink as efficiency expands. Electric usage by these centers will drop exponentially. Electric companies will have excess electricity capacity that becomes idle. :cool:

Will this be before or after my hair grows back (asking for a friend)?
 
Electric companies will have excess electricity capacity that becomes idle. :cool:
Or sold at a competitive discount as renewable sources gain a more than sustainable and profitable foothold. My guess is that few data centers powered by dedicated power plants will be built.
 
New chipsets will be developed that require 1/10th the energy to operate advanced AI systems and special CPU cooling will no longer be required. Existing tech will be retrofitted until its end of life. The physical space required for AI support will shrink as efficiency expands. Electric usage by these centers will drop exponentially. Electric companies will have excess electricity capacity that becomes idle. :cool:

ya really think?....is a.i. going to invent this new electricity efficiency? :rolleyes:
 
New chipsets will be developed that require 1/10th the energy to operate advanced AI systems and special CPU cooling will no longer be required. Existing tech will be retrofitted until its end of life. The physical space required for AI support will shrink as efficiency expands. Electric usage by these centers will drop exponentially. Electric companies will have excess electricity capacity that becomes idle. :cool:
The world will never need more than five AI centers.
 
Seriously, chip power consumption can probably be improved by a factor of a thousand, perhaps a million, and eventually any person who can afford a car will be able to build their own LLM. IBM already has a theoretical path to massive efficiency improvements.

Could be ten years, or fifty, but it’s coming.

LLMs will not be required to have general intelligence. They will be function calls to apps, and the apps will be another layer, just as brains have layers. Like parfaits.
 
New chipsets will be developed that require 1/10th the energy to operate advanced AI systems and special CPU cooling will no longer be required. Existing tech will be retrofitted until its end of life. The physical space required for AI support will shrink as efficiency expands. Electric usage by these centers will drop exponentially. Electric companies will have excess electricity capacity that becomes idle. :cool:
Doesn't matter: 10x more efficient with 100x growth is still 10x more power consumption.
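The arithmetic is worth spelling out, since it's the whole rebound argument in one line (numbers are the illustrative ones from this thread, not real forecasts):

```python
# Net data-center power if per-chip efficiency improves 10x
# while total compute demand grows 100x (a Jevons-style rebound).
baseline_power = 1.0   # normalized units
efficiency_gain = 10   # chips use 1/10th the energy per operation
demand_growth = 100    # 100x more compute is performed

net_power = baseline_power * demand_growth / efficiency_gain
print(net_power)  # 10.0 -- ten times today's consumption
```

Efficiency gains only cut total consumption if demand grows slower than efficiency improves.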

------------

Btw, interesting concept, hardcoding to silicon.

 
ya really think?....is a.i. going to invent this new electricity efficiency?
No, but managing electricity use across city-wide municipal facilities (energy, traffic control, street lights, etc.), and likewise utility infrastructure, with smart devices feeding control algorithms for the grid and water management. Traffic control to and from bridges at rush hour in major cities. These are systems with too much data to be handled manually, and with the advent of so-called smart devices and AI, they are applications that could pay back handsomely.
 
Nope, as there are those who earn their money with it, and will do any effort for this ...
Have to admit - I am struggling to parse this :)
 
New chipsets will be developed that require 1/10th the energy to operate advanced AI systems and special CPU cooling will no longer be required. Existing tech will be retrofitted until its end of life. The physical space required for AI support will shrink as efficiency expands. Electric usage by these centers will drop exponentially. Electric companies will have excess electricity capacity that becomes idle. :cool:
Many companies worldwide are working on photonic chips, which do not need much cooling and can process at much faster speeds. They can use analogue processing, encoding data in different frequencies of light to get results faster.
There are problems of course: the data needs converting from the binary input, etc. Though I've seen many new techniques to cope with the difficulties.
I expect it will take five years before they take off for use in data centres, though.
This vid is about the first photonic board, which is in production now -

 
Achieving a 10x reduction in AI energy consumption is not a pipe dream, but it requires moving away from the "brute force" scaling of current GPUs. While several technologies are in the race, In-Memory Computing (IMC) and Neuromorphic Computing are currently the most promising for near-term and long-term breakthroughs. Five research possibilities are listed below.

1. Neuromorphic chips are engineered to mimic the neural structure of the human brain, offering a 100 to 1,000-fold reduction in energy consumption per task compared to cloud-based processing.

2. Analog In-Memory Computing (IMC)
This technology merges processing and memory, solving the "von Neumann bottleneck" where energy is wasted moving data between the processor and memory.

3. Optical computing uses light (photons) instead of electricity (electrons) to perform calculations.

4. On-Device/Edge AI with Specialized ASICs
Moving AI processing from the cloud to the device (edge) removes the massive energy costs associated with data transmission to central data centers.

5. Algorithmic and Data Optimization (Software Layer)
Hardware advancements are being combined with software techniques to drastically cut power usage.
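To make point 5 concrete, here's a minimal sketch of 8-bit post-training quantization, one common software-layer technique: weights are stored as int8 and rescaled on the fly, cutting memory traffic roughly 4x versus float32 (and memory movement dominates inference energy). The numbers and function names are illustrative, not from any specific framework.

```python
def quantize(weights, bits=8):
    """Map floats to signed integers sharing one scale factor."""
    qmax = 2 ** (bits - 1) - 1                 # 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the integers."""
    return [x * scale for x in q]

w = [0.41, -1.30, 0.07, 0.88]                  # toy float32 weights
q, s = quantize(w)
w_hat = dequantize(q, s)
max_err = max(abs(a - b) for a, b in zip(w, w_hat))
print(q, max_err)  # small integers, reconstruction error well under 1%
```

The accuracy cost is usually negligible for inference, which is why nearly every edge-AI ASIC in point 4 assumes quantized models.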
 
New chipsets will be developed that require 1/10th the energy to operate advanced AI systems and special CPU cooling will no longer be required. Existing tech will be retrofitted until its end of life. The physical space required for AI support will shrink as efficiency expands. Electric usage by these centers will drop exponentially. Electric companies will have excess electricity capacity that becomes idle. :cool:
AI centers will have excess capacity. Easy political target even voters will understand. If only “green” energy costs were understood by consumers. At least picked the right color. I’ll reopen my law office at my age with a guaranteed rate of return.
 
AI centers will have excess capacity. Easy political target even voters will understand. If only “green” energy costs were understood by consumers. At least picked the right color. I’ll reopen my law office at my age with a guaranteed rate of return.

Apparently they'll feed the hungry, cure the sick, comfort the lovelorn, enlighten the ignorant, satisfy the curious ... but perhaps you can defend the indefensible, just in case they missed that. :facepalm:
 
Achieving a 10x reduction in AI energy consumption is not a pipe dream, but it requires moving away from the "brute force" scaling of current GPUs. While several technologies are in the race, In-Memory Computing (IMC) and Neuromorphic Computing are currently the most promising for near-term and long-term breakthroughs. Five research possibilities are listed below.

1. Neuromorphic chips are engineered to mimic the neural structure of the human brain, offering a 100 to 1,000-fold reduction in energy consumption per task compared to cloud-based.

2. Analog In-Memory Computing (IMC)
This technology merges processing and memory, solving the "von Neumann bottleneck" where energy is wasted moving data between the processor and memory.

3. Optical computing uses light (photons) instead of electricity (electrons) to perform calculations.

4. On-Device/Edge AI with Specialized ASICs
Moving AI processing from the cloud to the device (edge) removes the massive energy costs associated with data transmission to central data centers.

5. Algorithmic and Data Optimization (Software Layer)
Hardware advancements are being combined with software techniques to drastically cut power usage.
Once you accept that AI does not require bit-perfect logic, you can think about analog, or at least alternative architectures.

Maybe more than one architecture will survive.
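A toy illustration of that noise tolerance, assuming nothing about any real analog hardware: perturb every multiply by ~1%, as an imperfect analog circuit might, and the dot product still lands close to the exact answer, which is the kind of error a neural network typically absorbs without changing its output.

```python
import random

random.seed(0)  # deterministic for the sake of the example

def noisy_dot(a, b, noise=0.01):
    """Dot product with ~1% multiplicative noise on each operation."""
    return sum(x * y * (1 + random.uniform(-noise, noise))
               for x, y in zip(a, b))

a = [0.5, 1.2, 0.3, 0.9]
b = [1.1, 0.4, 0.7, 0.2]
exact = sum(x * y for x, y in zip(a, b))
approx = noisy_dot(a, b)
print(exact, approx)  # the two values agree to within ~1%
```

Bit-perfect digital logic pays a steep energy premium precisely to eliminate errors that, at this scale, the workload doesn't care about.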
 
AI centers will have excess capacity. Easy political target even voters will understand. If only “green” energy costs were understood by consumers. At least picked the right color. I’ll reopen my law office at my age with a guaranteed rate of return.

If only “fossil fuel” energy costs were understood by consumers and voters………
 