LIHKG PhD/Research Discussion Board (21)
我沒有放棄 2017-6-5 21:38:47
:^( :^( :^( :^( :^( :^(

數學白痴 2017-6-5 21:49:11
:^( :^( :^( :^( :^( :^( :^(

鈴谷提督 2017-6-5 21:52:16
Any brothers/sisters here who went out to work for two or three years and then went back to do a PhD? Is it hard to get a Prof to write you a reference letter after that long?

The longer I work, the more I feel I want to go back to doing research..

P.S. Master grad with Distinction

婆你呀麼彈彈波 2017-6-5 22:29:09
As far as I know, entropy in information theory is about the minimum number of bits needed to represent a symbol that occurs with some probability.
Then you can use a Huffman code to achieve lossless compression.

And this also seems to be somehow related to KLD.
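
For reference, the standard discrete-case definitions behind this claim (assuming KLD here means the Kullback-Leibler divergence) are, in LaTeX notation:

% Shannon entropy of a discrete source with symbol probabilities p(x):
% the lower bound, in bits per symbol, on any lossless (uniquely
% decodable) code for that source.
H(p) = -\sum_x p(x)\,\log_2 p(x)

% Kullback-Leibler divergence from q to p: the expected number of extra
% bits per symbol paid when source p is coded with a code optimized for
% q, i.e. cross-entropy minus entropy.
D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x)\,\log_2 \frac{p(x)}{q(x)} = H(p, q) - H(p)

Huffman coding achieves an expected codeword length within 1 bit of H(p), which is the sense in which entropy is the "minimum number of bits" per symbol.
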
忒修斯之船 2017-6-5 22:36:59
> As far as I know, entropy in information theory is about the minimum number of bits needed to represent a symbol that occurs with some probability.
> Then you can use a Huffman code to achieve lossless compression.
> And this also seems to be somehow related to KLD.

In a discrete probability setting, that's roughly it.

KLD stands for which divergence again?

:^(
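
To make the compression claim concrete, here is a minimal sketch in plain Python (the sample text and the function name are illustrative, not from the thread): it builds a Huffman code for a string and compares the code's average length against the empirical entropy.

import heapq
import math
from collections import Counter

def huffman_code(freqs):
    # Build the Huffman tree bottom-up with a min-heap keyed on weight.
    # Each heap entry is (weight, tiebreaker, tree); a tree is either a
    # symbol (leaf) or a (left, right) pair. The unique tiebreaker keeps
    # the heap from ever comparing two trees directly.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next_id, (t1, t2)))
        next_id += 1
    # Read codewords off the tree: left edge = "0", right edge = "1".
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"  # degenerate one-symbol source
    walk(heap[0][2], "")
    return code

text = "abracadabra"
freqs = Counter(text)
code = huffman_code(freqs)
n = len(text)
# Empirical entropy H(p) vs the code's expected length, in bits/symbol.
entropy = -sum(w / n * math.log2(w / n) for w in freqs.values())
avg_len = sum(w * len(code[s]) for s, w in freqs.items()) / n
print(code)
print(f"H(p) = {entropy:.3f} bits/symbol, Huffman average = {avg_len:.3f}")

The printed average sits within 1 bit of the entropy, as Huffman's optimality bound guarantees. Coding the same source with a code optimized for the wrong distribution q would cost roughly D_KL(p||q) extra bits per symbol, which is the connection to KLD raised above.
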
忒修斯之船 2017-6-5 22:41:21
:^(

婆你呀麼彈彈波 2017-6-5 22:59:21
:^( :^( :^(

婆你呀麼彈彈波 2017-6-5 22:59:58
:^( :^( :^(

忒修斯之船 2017-6-5 23:05:48
:^( :^( :^( :^(

我沒有放棄 2017-6-5 23:10:48
:^( :^( :^( :^( :^(

忒修斯之船 2017-6-5 23:12:52
:^( :^( :^( :^( :^( :^(

忒修斯之船 2017-6-5 23:18:36
:^( :^( :^( :^( :^( :^( :^(

婆你呀麼彈彈波 2017-6-5 23:19:24
:^( :^( :^( :^( :^( :^( :^(

我沒有放棄 2017-6-5 23:19:44
:^( :^( :^( :^( :^( :^( :^(

我沒有放棄 2017-6-5 23:34:01
:^( :^( :^( :^( :^( :^( :^( :^( :^(

婆你呀麼彈彈波 2017-6-5 23:34:58
:^( :^( :^( :^( :^( :^( :^( :^(

婆你呀麼彈彈波 2017-6-5 23:47:31
:^( :^( :^( :^( :^( :^( :^( :^(