New Photonic Chip Breakthrough: 1000x Faster. Is It Real?
2024-04-01 辽阔天空 9419
Main Text



Original translation: 龙腾网 https://www.ltaaa.cn; please credit the source when reposting.


Anastasi In Tech

March 16, 2024

Timestamps:
00:00 - Intro
03:16 - Lithium Niobate
05:56 - How does this chip work?


Comments
@TickerSymbolYOU
This is absolutely HUGE. Photonics needs way more coverage. Great breakdown of the paper!


@ChipChat1493
This lady could easily be a professor at practically 99 percent of the universities of the world! In one compelling video she brings so much "light" to the basic understanding of the world of photonic chips.
As Einstein once said, to the effect that if you cannot explain something simply enough that a six-year-old can understand it, then you do not understand it yourself!


@steveseidel9967
I remember writing a paper on optical computing back in the late 80's. There were high hopes back then. Not much has happened in this space since that time. It's encouraging to see some progress.


@scottgrout2640
I ran an optical switching startup in the late 90's. Met several times with the NSA to explore optical computing. Nice to see it developing.


@WarrenLacefield
It is funny. Your comment brought to mind Stephen Wolfram's idea of "computing" (where pretty much anything that exists dynamically is doing that). Actually, IMO, there have been huge advances in photovoltaics and photonics in the last 20-30 years. Your high-resolution TV and display screens are examples, as are solar power and Starlink and other satellite networks interconnected by lasers, not to mention biomedical applications, GPS, all sorts of sensors, etc. Google says China is first in this field - but that may be for applications. Numerous universities (Colorado at Boulder, Stanford, MIT, University of Rochester come to mind) are doing most of the world-class research in these fields IMO.


@ladygreen632
I remember pencils and paper. I miss the good old days


@davestorm6718
Well, actually there's been a lot of progress, just not as general purpose CPU computing, but in fiber optic communications switching (it's pretty much right under everyone's nose). :)


@scottgrout2640
@davestorm6718 Correct. There have been all-optical switches in comms networks for over 20 years now. And optical compute is far different from optical switching.


@lasselasse5215
TBH as a programmer, I don't understand your domain well enough to understand everything you teach.
But I am curious, and the main takeaway for me from your presentations in general is that it's very inspirational. You have a very good way of explaining things. Thank you for that!


@SomeUserNameBlahBlah
Don't worry, the people who do understand will develop a framework or library for you to connect up to, if needed.


@monad_tcp
This is an amazing way of injecting a lot of data into a silicon microchip so you can actually have faster processing. Heck, we could even bring back ring loops as memory, storing data at GHz rates inside a loop of optical fiber. Terabytes of memory faster than DRAM.
Forget about using it for computing; just moving data and storing data are already amazing. Modern CPUs/GPUs already spend most of their time just waiting for data to come into their caches. The rate at which computing can happen is severely restricted by memory and bandwidth speeds.
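The fiber "ring memory" idea above can be sanity-checked with a quick back-of-envelope calculation. All figures below (loop length, group index, per-channel rate, channel count) are illustrative assumptions, not numbers from the video or the comment:

```python
# Capacity of a recirculating fiber delay line: the bits "in flight" equal
# the round-trip time multiplied by the injection rate.
C = 299_792_458          # speed of light in vacuum, m/s
N_GROUP = 1.468          # typical group index of silica fiber (assumed)
LOOP_LENGTH_M = 100_000  # hypothetical 100 km loop
BIT_RATE_BPS = 100e9     # 100 Gb/s per wavelength (assumed)
CHANNELS = 80            # dense-WDM wavelength count (assumed)

round_trip_s = LOOP_LENGTH_M * N_GROUP / C
bits_in_flight = round_trip_s * BIT_RATE_BPS * CHANNELS
print(f"round trip: {round_trip_s * 1e3:.2f} ms")
print(f"capacity:   {bits_in_flight / 8 / 1e9:.2f} GB")
```

Even with these generous assumptions the loop holds roughly half a gigabyte, so "terabytes" would need far longer loops or many more channels; the attraction of such a delay line is bandwidth and latency, not raw capacity.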


@WarrenLacefield
You make a good point, this "chip" and similar devices might be better suited for data transfer and communications (not sure about "data storage") than for typical computing. At least perhaps for the time being.


@joshua43214
I still remember the first time I allocated a Tb of memory on our HPC, the sense of power was awesome.
Still took half a day to process my data though :(


@stevesteve8098
They are "storing" the data feeds in optical fiber BEFORE processing them.


@IakobusAtreides
Was looking forward to your take on this paper. Thank you


@W4rcrafter
Your channel seems a bit ahead of its time. Keep up the good work, you're great !


@juanluismartinez4587
This could be streamlined AI.


@popquizzz
Complexity of systems increases the probabilities of errors and breakdowns.


@hubertdaugherty8986
SAW (Surface Acoustic Wave) devices are conceptually similar. The interaction and coupling of different energy types. Resonators, couplers and transmission lines are designed into the system to accommodate each energy type.
Reliable low power laser emitters are readily available from the fiber optic communications industry.
For photonics to expand there needs to be a toolbox of active elements which can be simulated.
Much like the inductive, resistive, and capacitive components in electronics.



@bowlingballz_
You remind me of one of my old professors. She was an older woman, a wonderful professor, and I'm very fond of her. She taught Data Structures and Algorithms. Thanks for the breakdown of this. It's very clear and understandable.


@bbamboo3
Good to see a sober assessment of the developments. Eager to see systems that exploit the full parallelism possible with optical computing.



@misterbum1
Me no good at science. Your presentations are so clear, however, that even those of us without great scientific backgrounds can garner the gist of your messages. Really, really, well done. Thank you.


@daddy7860
I'm super excited to see the analog computing applications of photonics, especially around the matrix multiplication arenas. I can see it being a game-changing step to the next revolution of tech


@scottwatschke4192
Thank you for another interesting video. I look forward to the day when the technology is perfected for photonics. I think it'll be history making.


@yoyo-jc5qg
This is crazy technology if they can master it. Because of light's ability to be split into different wavelengths, or colors, this will significantly increase data storage and speed. It'll make Silicon Valley look like the stone age.
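The "different colors" point is wavelength-division multiplexing (WDM), and its effect on throughput is simple multiplication. The channel count and per-channel rate below are assumed, illustrative values:

```python
# WDM arithmetic: aggregate throughput scales linearly with the number of
# wavelengths carried simultaneously on one waveguide.
channels = 64            # distinct wavelengths (assumed)
per_channel_gbps = 100   # modulation rate per wavelength (assumed)

aggregate_gbps = channels * per_channel_gbps
print(f"aggregate: {aggregate_gbps / 1000:.1f} Tb/s on a single waveguide")
```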


@416dl
In the blink of an eye... thanks again for a great glimpse at the future. Cheers.


@siyabongampongwana990
I think the next age of innovation is somewhat dependent on breakthroughs in materials science. It is the key that will unlock a whole lot; everything will just fall into place, and we will move faster and much further.



@bigbluebuttonman1137
Photonics is very intriguing. It’ll be very interesting to see how it goes.


@dougcox835
I remember when this sort of thing first appeared decades ago. My thought then was that it may be fast, but it still needs to interface with regular electronics at some point, and that would be the bottleneck.


@DihelsonMendonca
Imagine photonics and quantum computing...


@psaicon0
I learn a lot from you! thanks for posting these videos!


@constantinosschinas4503
I remember that the University of Crete, in Greece, had breakthrough research on this type of computing 20 years ago.


@id104335409
There have been recent breakthroughs in
1 AI
2 Quantum
3 Photonics
Now imagine an AI running on a Photonic Quantum computer!
It would make all our existing computers as useful as handheld calculators.


@monad_tcp
Doesn't even need to be entirely photonics. Especially if later we develop a way of fusing photonics and electronics.
You can do a lot of computing in parallel, so the latency in electronics wasn't the problem; the problem is always the von Neumann bottleneck, the bandwidth at which you can inject data into a silicon chip.
For example, the H100 GPU can do merely 3.35 TB/s of data transfer internally. It does 51 teraflops (FP32), but because it can only transfer 250 GB/s from memory, it doesn't even get close to that.
The H100 has 456 tensor units, if you could feed data without ever having to wait a single clock cycle, you would need at least 153TB/s.
But if your chip can do 1 multiplication per clock cycle (*1) at 3.5Ghz, that would at least consume 0.336 TB/s for a single unit, or 153TB/s total.
So if you could feed data at the rate it can consume, you can easily do 50 times more computing. You can do 2.5 Petaflops instead per chip.
(*1 - tensor cores in the volta actually have 5 stages, those at least use 5 cycles per "instruction", I'm considering 3 floats inputs)
But there's no way to inject that much data into a silicon chip; the die size would have to be huge for the number of pins it would need to have. Maybe some IBM mainframes can do that. 150 TB/s is a lot of data.
Now with photonics, that's a possible optimization, far away (aka, in the same server, but not on the same PCB) you can have terabytes of RAM in parallel feeding data to a microwave emitter and then to a light pipe and then to a single microchip. (it would still need to have another back-converter to microwave and a receptor to feed the data to the silicon above, but that would be flip-chip interconnect instead)
Also you can now make it even denser with much more tensor units and remove all the cruft used for managing caches, you don't need caching or even registers, its pure computing, data in / data out.
You can even do crazy things like active cooling inside a microchip if you don't have so much die size being spent on interconnect or cache.
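The comment's arithmetic can be replayed directly. These are the commenter's own figures (456 tensor units, 0.336 TB/s per unit, 51 FP32 TFLOPS, 3.35 TB/s internal transfer), not numbers verified against NVIDIA's published specs:

```python
# Replaying the back-of-envelope bandwidth argument from the comment above.
TENSOR_UNITS = 456
PER_UNIT_TBPS = 0.336      # claimed feed rate per tensor unit, TB/s
FP32_TFLOPS = 51           # claimed FP32 throughput
HBM_TBPS = 3.35            # claimed internal transfer rate, TB/s

total_feed_tbps = TENSOR_UNITS * PER_UNIT_TBPS    # bandwidth to never stall
headroom = total_feed_tbps / HBM_TBPS             # how far short we fall
print(f"needed feed: {total_feed_tbps:.0f} TB/s, headroom: {headroom:.0f}x")
print(f"implied ceiling: {FP32_TFLOPS * headroom / 1000:.1f} PFLOPS")
```

By these figures the headroom is closer to 46x (about 2.3 PFLOPS) than the comment's round "50 times" and "2.5 Petaflops", but the order of magnitude holds either way.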


@rremnar
Handheld calculators are still useful, especially the ones that use solar power. I still have mine from decades ago, and it works, using an LCD and a small solar panel.


@alansmithee419
I don't think a photonic quantum computer would benefit from many of the advancements in classical photonic computing.
I could be wrong, but I imagine quantum vs classical photonics is a whole different beast.
But yeah, photonics could certainly speed up AI if it gets good enough, and quantum computers make AI go crazy, but that's probably further away. We'd probably have to relearn a lot of what we know about training classical AIs too.


@gregbarber8166
I believe they have a long road ahead, Anastasi, but I think they might have a few good innovations. Theoretically it is exciting, and easier to understand now. Good job!


@givemeyourmoneynao
The problem with all forms of analog computing is storing results for multi-step computations, or using those results later. We can easily do this with digital circuits. The problem with analog computing is that you need to convert the analog signals to digital to store the result, then convert them back to analog to process further.
The additional hardware and power consumption to do this conversion greatly outweighs all the benefits. There needs to be a breakthrough in how we convert or store analog results in order for this technology to be beneficial.
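To put a rough number on that conversion tax, here is a sketch with assumed, order-of-magnitude energy figures; none of these values come from the comment or the video:

```python
# If every analog result must be digitized (ADC) and re-emitted (DAC)
# between compute steps, the conversions can dominate the energy budget.
ADC_PJ = 1.0      # energy per ADC sample, picojoules (assumed)
DAC_PJ = 0.5      # energy per DAC sample, picojoules (assumed)
MAC_PJ = 0.01     # energy per analog multiply-accumulate (assumed)

overhead = (ADC_PJ + DAC_PJ) / MAC_PJ
print(f"conversion costs roughly {overhead:.0f}x the analog MAC itself")
```

Under these hypothetical figures the round-trip conversion costs on the order of 100x the analog operation it brackets, which is the imbalance the comment is pointing at.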


@jimg8296
I have been saying for 20 years that photonics is the ultimate in processing; it's finally good to see it starting to take hold.



@Lubossxd
So photonic chips are becoming more feasible. Interesting. We do indeed need a new breakthrough in technology in order to get more significant improvements per generation. It feels like we have been hitting a ceiling these past few years.


@Rachelebanham
I used to design ring resonators, tapers and couplers.


@ThomasTomiczek
The big problem here is not the chip - it is manufacturing. The idea is amazing, but they need to get the whole supply chain up for fabs to implement it at scale; then someone designs the chips. The latter is quite easy, but the supply chain for whole fabs will likely take years. There is a huge pull: you can sell these chips at a VERY high price for some time, because, in the end, you are 1000 times faster, AND you also use a lot less energy (the latter being money spent and infrastructure requirements).


@Flameboar
Thanks for the video. Photonics may be useful in the future, but as you explained, this technology must go through considerable development before it is ready to replace electronics. I've been able to watch the progress of lithography, including several promising technologies which never reached large scale commercialization.


@Sprengstoff
Very cool. Photonics is very exciting now that we are approaching the limit of silicon.


@JMeyer-qj1pv
Anything we can do to lower power usage for AI will be good for the planet. With photonics I question if the logic density can ever compete with what has been achieved with silicon gates. I'm not sure quantum computing will be the next big thing, since I think the next big thing will probably be discovered using AI, and it will come up with an approach we haven't even considered before!


@QuikRay
Actually, it's much faster than 1000 times when you start breaking it down into different colors... still in the future, but we will get there.

