
Generalization to different grids and datasets


Authors:

(1) Vladislav Trifonov, Skoltech ([email protected]);

(2) Alexander Rudikov, AIRI, Skoltech;

(3) Oleg Iliev, Fraunhofer ITWM;

(4) Ivan Oseledets, AIRI, Skoltech;

(5) Ekaterina Muravleva, Skoltech.

Table of Links

Abstract and 1 Introduction

2 Neural design of preconditioner

3 Learn correction for ILU and 3.1 Graph neural network with preserving sparsity pattern

3.2 PreCorrector

4 Dataset

5 Experiments

5.1 Experiment environment and 5.2 Comparison with classical preconditioners

5.3 Loss function

5.4 Generalization to different grids and datasets

6 Related work

7 Conclusion and further work, and References

Appendix

5.4 Generalization to different grids and datasets

We also observe good generalization when our preconditioner is transferred between grids and datasets (Figure 4). Transfer between datasets of increasing and decreasing complexity incurs no loss of quality, which means the model can be trained on simple PDEs and then applied to complex ones at inference. When the dataset complexity is fixed and the learned model is transferred to other grids, the loss of quality is only about 10%.




This paper is available on arxiv under CC by 4.0 Deed (Attribution 4.0 International) license.



