LoGAH: Predicting 774-Million-Parameter Transformers using Graph HyperNetworks with 1/100 Parameters

Published in arXiv, 2024