| Author | Commit | Message | Date |
| Guillaume Wenzek | d5b035f230 | fix PositionalEmbedding with cache | 1 year ago |
| Guillaume Wenzek | 8c074387a9 | s2tt generate the sample if needed | 1 year ago |
| Guillaume Wenzek | ac4a2eada9 | job.num_threads | 2 years ago |
| Guillaume Wenzek | f6d810543d | fix incremental decoding ! | 1 year ago |
| Guillaume Wenzek | 522b97234e | WIP: simple failing test case | 1 year ago |
| Guillaume Wenzek | 2c543185e2 | add kv_cache to fairseq2_model | 1 year ago |
| Guillaume Wenzek | 44a4ca129a | read model config from "layer_config" | 2 years ago |
| Guillaume Wenzek | eb7810b81f | force little-endian | 2 years ago |
| Guillaume Wenzek | c31926c1a8 | working out new way of saving hparams | 2 years ago |
| Ning | 1589c529bf | unity.cpp speech_encoder_frontend+speech_encoder+adaptor (#81) | 2 years ago |
| Guillaume Wenzek | e38442d1f0 | rename mask to attn_mask in MultiheadAttention_forward | 2 years ago |
| Guillaume Wenzek | 6fbb465f2b | generate_sequence return full results | 2 years ago |
| Guillaume Wenzek | 1756897d23 | simplify _finalize_hypothesis | 2 years ago |
| Guillaume Wenzek | f49763de86 | fix beam size and scores | 2 years ago |
| Guillaume Wenzek | c28db8c8ac | wip beam-size=2 | 2 years ago |
| Guillaume Wenzek | b24dbe3030 | batching -> TransformerEmbeddingFrontend_forward | 2 years ago |
| Guillaume Wenzek | 86993cbd00 | fix StandardTransformerEncoder | 2 years ago |
| Guillaume Wenzek | 28ed039370 | fix MultiheadAttention_forward | 2 years ago |
| Guillaume Wenzek | 81cdf80eb9 | WIP: MultiheadAttention_forward | 2 years ago |
| Guillaume Wenzek | eb80195345 | use ggml_diag_mask_inf | 2 years ago |
| Guillaume Wenzek | 88b0690a72 | split tests files | 2 years ago |
| Guillaume Wenzek | bfbafd9603 | fix generation with beam_size=1 | 2 years ago |
| Guillaume Wenzek | 45f986055a | add naive tweaking of lprobs | 2 years ago |
| Guillaume Wenzek | 7c9b2a1b95 | pass correct prefix sequence in test | 2 years ago |
| Guillaume Wenzek | c7b89f32f4 | disable flash attn because of cross attention | 2 years ago |
| Guillaume Wenzek | dcb9535666 | wip: generate_sequence | 2 years ago |
| Guillaume Wenzek | 78e7c9a311 | fix TransformerEmbeddingFrontend | 2 years ago |
| Guillaume Wenzek | 2238cea072 | SinusoidalPositionEncoder + WIP: TransformerEmbeddingFrontend | 2 years ago |
| Guillaume Wenzek | 2fb09f34fb | generate fairseq2.cpp | 2 years ago |
| Guillaume Wenzek | f1f33dbec1 | has_layer + transformer decoder | 2 years ago |