. . "In statistica e teoria dell'informazione, l'informazione di Fisher \u00E8 la varianza dello score (derivata logaritmica) associato a una data funzione di verosimiglianza. L'informazione di Fisher, che prende il nome dal celebre genetista e statistico Ronald Fisher, pu\u00F2 essere interpretata come l'ammontare di informazione contenuta da una variabile casuale osservabile , concernente un parametro non osservabile , da cui dipende la distribuzione di probabilit\u00E0 di . Denotando l'informazione di Fisher con , poich\u00E9 il valore atteso dello score \u00E8 nullo, la sua varianza \u00E8 pari al suo momento del secondo ordine, cos\u00EC che: dove denota la funzione di verosimiglianza. Una scrittura equivalente \u00E8: ossia meno il valore atteso della derivata seconda della funzione di verosimiglianza rispetto a ; l'informazione di Fisher pu\u00F2 dunque essere letta come una misura della curvatura della verosimiglianza in corrispondenza della stima di massima verosimiglianza per . Una verosimiglianza piatta, con una derivata seconda modesta, comporter\u00E0 minore informazione, laddove una maggior curva apporter\u00E0 una maggiore quantit\u00E0 di informazione."@it . . "44496"^^ . . . . . . . . . . . . "\u6570\u7406\u7EDF\u8BA1\u5B66\u4E2D\uFF0C\u8D39\u5E0C\u5C14\u4FE1\u606F\uFF08\u82F1\u8BED\uFF1AFisher Information\uFF1B\u6709\u6642\u7A31\u4F5C information\uFF09\uFF0C\u6216\u7A31\u8CBB\u96EA\u8A0A\u606F\u6578\uFF0C\u901A\u5E38\u8BB0\u4F5C\uFF0C\u662F\u8861\u91CF\u89C2\u6D4B\u6240\u5F97\u7684\u968F\u673A\u53D8\u91CF\u643A\u5E26\u7684\u5173\u4E8E\u672A\u77E5\u6BCD\u6578\u7684\u8A0A\u606F\u91CF\uFF0C\u5176\u4E2D\u7684\u6982\u7387\u5206\u5E03\u4F9D\u8D56\u4E8E\u6BCD\u6578\u3002\u8D39\u5E0C\u5C14\u4FE1\u606F\u7531\u7EDF\u8BA1\u5B66\u5BB6\u7F57\u7EB3\u5FB7\u00B7\u8D39\u5E0C\u5C14\u5728\u5F17\u6717\u897F\u65AF\u00B7\u4F0A\u897F\u5FB7\u7F57\u00B7\u57C3\u5947\u6C83\u601D\u5DE5\u4F5C\u7684\u57FA\u7840\u4E0A\u63D0\u51FA\uFF0C\u73B0\u5E38\u7528\u4E8E\u6700\u5927\u4F3C\u7136\u4F30\u8BA1\u548C\u4E2D\u3002"@zh . . "Informacja Fishera"@pl . . . "L'information de Fisher est une notion de statistique introduite par R.A. Fisher qui quantifie l'information relative \u00E0 un param\u00E8tre contenue dans une distribution. Elle est d\u00E9finie comme l'esp\u00E9rance de l'information observ\u00E9e, ou encore comme la variance de la fonction de score. Dans le cas multi-param\u00E9trique, on parle de matrice d'information de Fisher."@fr . . . "\u6570\u7406\u7EDF\u8BA1\u5B66\u4E2D\uFF0C\u8D39\u5E0C\u5C14\u4FE1\u606F\uFF08\u82F1\u8BED\uFF1AFisher Information\uFF1B\u6709\u6642\u7A31\u4F5C information\uFF09\uFF0C\u6216\u7A31\u8CBB\u96EA\u8A0A\u606F\u6578\uFF0C\u901A\u5E38\u8BB0\u4F5C\uFF0C\u662F\u8861\u91CF\u89C2\u6D4B\u6240\u5F97\u7684\u968F\u673A\u53D8\u91CF\u643A\u5E26\u7684\u5173\u4E8E\u672A\u77E5\u6BCD\u6578\u7684\u8A0A\u606F\u91CF\uFF0C\u5176\u4E2D\u7684\u6982\u7387\u5206\u5E03\u4F9D\u8D56\u4E8E\u6BCD\u6578\u3002\u8D39\u5E0C\u5C14\u4FE1\u606F\u7531\u7EDF\u8BA1\u5B66\u5BB6\u7F57\u7EB3\u5FB7\u00B7\u8D39\u5E0C\u5C14\u5728\u5F17\u6717\u897F\u65AF\u00B7\u4F0A\u897F\u5FB7\u7F57\u00B7\u57C3\u5947\u6C83\u601D\u5DE5\u4F5C\u7684\u57FA\u7840\u4E0A\u63D0\u51FA\uFF0C\u73B0\u5E38\u7528\u4E8E\u6700\u5927\u4F3C\u7136\u4F30\u8BA1\u548C\u4E2D\u3002"@zh . . . . . . 
"\u0423 \u043C\u0430\u0442\u0435\u043C\u0430\u0442\u0438\u0447\u043D\u0456\u0439 \u0441\u0442\u0430\u0442\u0438\u0441\u0442\u0438\u0446\u0456 \u0442\u0430 \u0442\u0435\u043E\u0440\u0456\u0457 \u0456\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0456\u0457 \u0456\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0456\u0454\u044E \u0437\u0430 \u0424\u0456\u0448\u0435\u0440\u043E\u043C \u043D\u0430\u0437\u0438\u0432\u0430\u0454\u0442\u044C\u0441\u044F \u043C\u0456\u0440\u0430 \u043A\u0456\u043B\u044C\u043A\u043E\u0441\u0442\u0456 \u0456\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0456\u0457, \u0449\u043E \u0441\u043F\u043E\u0441\u0442\u0435\u0440\u0435\u0436\u0443\u0432\u0430\u043D\u0430 \u0432\u0438\u043F\u0430\u0434\u043A\u043E\u0432\u0430 \u0437\u043C\u0456\u043D\u043D\u0430 X \u043D\u0435\u0441\u0435 \u043F\u0440\u043E \u043D\u0435\u0432\u0456\u0434\u043E\u043C\u0438\u0439 \u043F\u0430\u0440\u0430\u043C\u0435\u0442\u0440 \u03B8, \u0432\u0456\u0434 \u044F\u043A\u043E\u0433\u043E \u0437\u0430\u043B\u0435\u0436\u0438\u0442\u044C \u0439\u043C\u043E\u0432\u0456\u0440\u043D\u0456\u0441\u0442\u044C X. \u0424\u043E\u0440\u043C\u0430\u043B\u044C\u043D\u043E \u0446\u0435 \u0434\u0438\u0441\u043F\u0435\u0440\u0441\u0456\u044F \u0444\u0443\u043D\u043A\u0446\u0456\u0457 \u0432\u043D\u0435\u0441\u043A\u0443 \u0432\u0438\u0431\u0456\u0440\u043A\u0438. \u0426\u044F \u0444\u0443\u043D\u043A\u0446\u0456\u044F \u043D\u0430\u0437\u0432\u0430\u043D\u0430 \u043D\u0430 \u0447\u0435\u0441\u0442\u044C \u0420\u043E\u043D\u0430\u043B\u044C\u0434\u0430 \u0424\u0456\u0448\u0435\u0440\u0430, \u0449\u043E \u043E\u043F\u0438\u0441\u0430\u0432 \u0457\u0457."@uk . . . . . . . . . . . . . . . . "Fisherinformatie"@nl . . "\u30D5\u30A3\u30C3\u30B7\u30E3\u30FC\u60C5\u5831\u91CF\uFF08\u30D5\u30A3\u30C3\u30B7\u30E3\u30FC\u3058\u3087\u3046\u307B\u3046\u308A\u3087\u3046\u3001\u82F1: Fisher information\uFF09 \u306F\u3001\u7D71\u8A08\u5B66\u3084\u60C5\u5831\u7406\u8AD6\u3067\u767B\u5834\u3059\u308B\u91CF\u3067\u3001\u78BA\u7387\u5909\u6570\u304C\u6BCD\u6570\u306B\u95A2\u3057\u3066\u6301\u3064\u300C\u60C5\u5831\u300D\u306E\u91CF\u3092\u8868\u3059\u3002\u7D71\u8A08\u5B66\u8005\u306E\u30ED\u30CA\u30EB\u30C9\u30FB\u30D5\u30A3\u30C3\u30B7\u30E3\u30FC\u306B\u56E0\u3093\u3067\u540D\u4ED8\u3051\u3089\u308C\u305F\u3002"@ja . . "Die Fisher-Information (benannt nach dem Statistiker Ronald Fisher) ist eine Kenngr\u00F6\u00DFe aus der mathematischen Statistik, die f\u00FCr eine Familie von Wahrscheinlichkeitsdichten definiert werden kann und Aussagen \u00FCber die bestm\u00F6gliche Qualit\u00E4t von Parametersch\u00E4tzungen in diesem Modell liefert.Die Fisher-Information spielt in der asymptotischen Theorie der Maximum-Likelihood-Sch\u00E4tzung eine wichtige Rolle und wird auch in der Bayes-Statistik bei der Berechnung von Priorverteilungen verwendet. Sie kann auch bei der Formulierung von Teststatistiken, wie beim Wald-Test verwendet werden."@de . . . . . . . "\u30D5\u30A3\u30C3\u30B7\u30E3\u30FC\u60C5\u5831\u91CF\uFF08\u30D5\u30A3\u30C3\u30B7\u30E3\u30FC\u3058\u3087\u3046\u307B\u3046\u308A\u3087\u3046\u3001\u82F1: Fisher information\uFF09 \u306F\u3001\u7D71\u8A08\u5B66\u3084\u60C5\u5831\u7406\u8AD6\u3067\u767B\u5834\u3059\u308B\u91CF\u3067\u3001\u78BA\u7387\u5909\u6570\u304C\u6BCD\u6570\u306B\u95A2\u3057\u3066\u6301\u3064\u300C\u60C5\u5831\u300D\u306E\u91CF\u3092\u8868\u3059\u3002\u7D71\u8A08\u5B66\u8005\u306E\u30ED\u30CA\u30EB\u30C9\u30FB\u30D5\u30A3\u30C3\u30B7\u30E3\u30FC\u306B\u56E0\u3093\u3067\u540D\u4ED8\u3051\u3089\u308C\u305F\u3002"@ja . . . . 
. "In de wiskundige statistiek is de fisherinformatie van een familie kansdichtheden een grootheid die informatie geeft over de kwaliteit van parameterschattingen. De grootheid is genoemd naar de Britse statisticus Ronald Aylmer Fisher."@nl . . "\uD53C\uC154 \uC815\uBCF4"@ko . . . . "\u0418\u043D\u0444\u043E\u0440\u043C\u0430\u0301\u0446\u0438\u044F \u0424\u0438\u0301\u0448\u0435\u0440\u0430 \u2014 \u043C\u0430\u0442\u0435\u043C\u0430\u0442\u0438\u0447\u0435\u0441\u043A\u043E\u0435 \u043E\u0436\u0438\u0434\u0430\u043D\u0438\u0435 \u043A\u0432\u0430\u0434\u0440\u0430\u0442\u0430 \u043E\u0442\u043D\u043E\u0441\u0438\u0442\u0435\u043B\u044C\u043D\u043E\u0439 \u0441\u043A\u043E\u0440\u043E\u0441\u0442\u0438 \u0438\u0437\u043C\u0435\u043D\u0435\u043D\u0438\u044F \u0443\u0441\u043B\u043E\u0432\u043D\u043E\u0439 \u043F\u043B\u043E\u0442\u043D\u043E\u0441\u0442\u0438 \u0432\u0435\u0440\u043E\u044F\u0442\u043D\u043E\u0441\u0442\u0438 . \u042D\u0442\u0430 \u0444\u0443\u043D\u043A\u0446\u0438\u044F \u043D\u0430\u0437\u0432\u0430\u043D\u0430 \u0432 \u0447\u0435\u0441\u0442\u044C \u043E\u043F\u0438\u0441\u0430\u0432\u0448\u0435\u0433\u043E \u0435\u0451 \u0420\u043E\u043D\u0430\u043B\u044C\u0434\u0430 \u0424\u0438\u0448\u0435\u0440\u0430."@ru . . . "\u0418\u043D\u0444\u043E\u0440\u043C\u0430\u0301\u0446\u0438\u044F \u0424\u0438\u0301\u0448\u0435\u0440\u0430 \u2014 \u043C\u0430\u0442\u0435\u043C\u0430\u0442\u0438\u0447\u0435\u0441\u043A\u043E\u0435 \u043E\u0436\u0438\u0434\u0430\u043D\u0438\u0435 \u043A\u0432\u0430\u0434\u0440\u0430\u0442\u0430 \u043E\u0442\u043D\u043E\u0441\u0438\u0442\u0435\u043B\u044C\u043D\u043E\u0439 \u0441\u043A\u043E\u0440\u043E\u0441\u0442\u0438 \u0438\u0437\u043C\u0435\u043D\u0435\u043D\u0438\u044F \u0443\u0441\u043B\u043E\u0432\u043D\u043E\u0439 \u043F\u043B\u043E\u0442\u043D\u043E\u0441\u0442\u0438 \u0432\u0435\u0440\u043E\u044F\u0442\u043D\u043E\u0441\u0442\u0438 . \u042D\u0442\u0430 \u0444\u0443\u043D\u043A\u0446\u0438\u044F \u043D\u0430\u0437\u0432\u0430\u043D\u0430 \u0432 \u0447\u0435\u0441\u0442\u044C \u043E\u043F\u0438\u0441\u0430\u0432\u0448\u0435\u0433\u043E \u0435\u0451 \u0420\u043E\u043D\u0430\u043B\u044C\u0434\u0430 \u0424\u0438\u0448\u0435\u0440\u0430."@ru . . "Informacja Fishera \u2013 miara ilo\u015Bci informacji o jednym lub wielu nieznanych parametrach jak\u0105 niesie obserwowalna zwi\u0105zana z nimi zmienna losowa . Mo\u017Ce by\u0107 rozumiana jako \u015Brednia dok\u0142adno\u015B\u0107 oszacowania, jak\u0105 daje obserwacja danych \u2013 tj. warto\u015B\u0107 oczekiwana brzegowej wiarygodno\u015Bci estymatora parametru wzgl\u0119dem obserwacji danych W przypadku jednego parametru i zmiennej ci\u0105g\u0142ej, oraz przy za\u0142o\u017Ceniu okre\u015Blonego statystycznego modelu ich wzajemnej zale\u017Cno\u015Bci wyra\u017Ca j\u0105 r\u00F3wnanie: Jest to wi\u0119c druga pochodna, czyli pochodna gradientu funkcji prawdopodobie\u0144stwa, pozwalaj\u0105ca wyrazi\u0107 szybko\u015B\u0107 jego zmian przy jej maksimum. Innymi s\u0142owy, informacja Fishera opisuje jak bardzo rozk\u0142ad wiarygodno\u015Bci estymatora parametru wzgl\u0119dem obserwacji zmiennej losowej jest skupiony blisko maksimum, czyli jak\u0105 wariancj\u0105 si\u0119 cechuje. 
Dla por\u00F3wnania, entropia Shannona wyra\u017Ca globalny \u015Bredni przyrost informacji, jak\u0105 daje obserwacja danych, w estymatorze histogramowym przyjmuj\u0105c posta\u0107: Ronald Fisher opisa\u0142 informacj\u0119 Fishera tak\u017Ce jako wewn\u0119trzn\u0105 dok\u0142adno\u015B\u0107 krzywej b\u0142\u0119du (intrinsic accuracy of an error curve). W przypadku wielu parametr\u00F3w jej wynik ma posta\u0107 macierzy Hessego. Ma postaci zar\u00F3wno dla zmiennych ci\u0105g\u0142ych, jak i dyskretnych. Miara ta wyst\u0119puje w wielu obszarach matematyki, statystyki i teorii informacji, w szczeg\u00F3lno\u015Bci stanowi g\u0142\u00F3wn\u0105 cz\u0119\u015B\u0107 nier\u00F3wno\u015Bci Cram\u00E9ra-Rao. Zasada nieoznaczono\u015Bci Heisenberga mo\u017Ce by\u0107 traktowana jako szczeg\u00F3lny przypadek minimum Cram\u00E9ra-Rao, a oba wzory opieraj\u0105 si\u0119 o nier\u00F3wno\u015B\u0107 Cauchy\u2019ego-Schwarza. Entropi\u0119 Shannona i informacj\u0119 Fishera, oraz inne miary informacji \u0142\u0105czy to\u017Csamo\u015B\u0107 de Bruijna i dywergencja Kullbacka-Leiblera."@pl . . . . . . . . . . . . . "1121357221"^^ . . "Die Fisher-Information (benannt nach dem Statistiker Ronald Fisher) ist eine Kenngr\u00F6\u00DFe aus der mathematischen Statistik, die f\u00FCr eine Familie von Wahrscheinlichkeitsdichten definiert werden kann und Aussagen \u00FCber die bestm\u00F6gliche Qualit\u00E4t von Parametersch\u00E4tzungen in diesem Modell liefert.Die Fisher-Information spielt in der asymptotischen Theorie der Maximum-Likelihood-Sch\u00E4tzung eine wichtige Rolle und wird auch in der Bayes-Statistik bei der Berechnung von Priorverteilungen verwendet. Sie kann auch bei der Formulierung von Teststatistiken, wie beim Wald-Test verwendet werden."@de . . . . . "\uD1B5\uACC4\uD559\uC5D0\uC11C \uD53C\uC154 \uC815\uBCF4(\uC601\uC5B4: Fisher information)\uB294 \uC5B4\uB5A4 \uD655\uB960\uBCC0\uC218\uC758 \uAD00\uCE21\uAC12\uC73C\uB85C\uBD80\uD130, \uD655\uB960\uBCC0\uC218\uC758 \uBD84\uD3EC\uC758 \uB9E4\uAC1C\uBCC0\uC218\uC5D0 \uB300\uD574 \uC720\uCD94\uD560 \uC218 \uC788\uB294 \uC815\uBCF4\uC758 \uC591\uC774\uB2E4."@ko . "Informacja Fishera \u2013 miara ilo\u015Bci informacji o jednym lub wielu nieznanych parametrach jak\u0105 niesie obserwowalna zwi\u0105zana z nimi zmienna losowa . Mo\u017Ce by\u0107 rozumiana jako \u015Brednia dok\u0142adno\u015B\u0107 oszacowania, jak\u0105 daje obserwacja danych \u2013 tj. warto\u015B\u0107 oczekiwana brzegowej wiarygodno\u015Bci estymatora parametru wzgl\u0119dem obserwacji danych W przypadku jednego parametru i zmiennej ci\u0105g\u0142ej, oraz przy za\u0142o\u017Ceniu okre\u015Blonego statystycznego modelu ich wzajemnej zale\u017Cno\u015Bci wyra\u017Ca j\u0105 r\u00F3wnanie:"@pl . . "\u8D39\u5E0C\u5C14\u4FE1\u606F"@zh . "Information de Fisher"@fr . . . . . . . . . . . . . . . . . . . . "Fisher information"@en . . . . . . . . . . . "In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter \u03B8 of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein\u2013von Mises theorem, which was anticipated by Laplace for exponential families). 
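As a concrete reading of the Cramér–Rao inequality mentioned in the Polish abstract (an added illustration, not part of the source), the bound for a single parameter and an unbiased estimator \hat\theta built from n i.i.d. observations is:

```latex
% Cramér–Rao lower bound (single parameter, unbiased estimator, n i.i.d. observations):
\operatorname{Var}(\hat\theta) \;\ge\; \frac{1}{n\,I(\theta)}.
% Example: for X_i \sim \mathcal{N}(\theta,\sigma^{2}) with \sigma^{2} known, I(\theta)=1/\sigma^{2},
% so \operatorname{Var}(\hat\theta)\ge\sigma^{2}/n; the sample mean \bar{X} attains this bound:
\operatorname{Var}(\bar{X}) = \frac{\sigma^{2}}{n} = \frac{1}{n\,I(\theta)}.
```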
Abstract (en): In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter \theta of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families). The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth). The Fisher information is also used in the calculation of the Jeffreys prior, which is used in Bayesian statistics. The Fisher information matrix is used to calculate the covariance matrices associated with maximum-likelihood estimates, as sketched below. It can also be used in the formulation of test statistics, such as the Wald test. Statistical systems of a scientific nature (physical, biological, etc.) whose likelihood functions obey shift invariance have been shown to obey maximum Fisher information. The level of the maximum depends upon the nature of the system constraints.
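The English abstract notes that the Fisher information matrix is used to obtain covariance matrices for maximum-likelihood estimates. The following sketch (an added illustration that assumes i.i.d. data from a two-parameter normal model) shows that use: invert n times the plug-in information matrix to get approximate standard errors for the MLEs.

```python
import numpy as np

# Added sketch: approximate covariance of the MLEs (mu_hat, sigma2_hat) for i.i.d.
# Normal(mu, sigma^2) data, using the inverse of n times the per-observation
# Fisher information matrix  I(mu, sigma^2) = diag(1/sigma^2, 1/(2*sigma^4)).

rng = np.random.default_rng(1)
mu_true, sigma2_true, n = 5.0, 4.0, 500
x = rng.normal(mu_true, np.sqrt(sigma2_true), size=n)

# Maximum-likelihood estimates
mu_hat = x.mean()
sigma2_hat = np.mean((x - mu_hat) ** 2)

# Plug-in per-observation Fisher information matrix, evaluated at the MLEs
I_one = np.array([[1.0 / sigma2_hat, 0.0],
                  [0.0, 1.0 / (2.0 * sigma2_hat ** 2)]])

# Large-sample approximation: Cov(mu_hat, sigma2_hat) ~ inverse of (n * I_one)
cov_approx = np.linalg.inv(n * I_one)
se_mu, se_sigma2 = np.sqrt(np.diag(cov_approx))

print(f"mu_hat     = {mu_hat:.3f}   (approx. std. error {se_mu:.3f})")
print(f"sigma2_hat = {sigma2_hat:.3f}   (approx. std. error {se_sigma2:.3f})")
```

In this model the approximation reduces to the familiar standard errors sqrt(sigma2_hat/n) for the mean and sqrt(2*sigma2_hat^2/n) for the variance.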
"\u0423 \u043C\u0430\u0442\u0435\u043C\u0430\u0442\u0438\u0447\u043D\u0456\u0439 \u0441\u0442\u0430\u0442\u0438\u0441\u0442\u0438\u0446\u0456 \u0442\u0430 \u0442\u0435\u043E\u0440\u0456\u0457 \u0456\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0456\u0457 \u0456\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0456\u0454\u044E \u0437\u0430 \u0424\u0456\u0448\u0435\u0440\u043E\u043C \u043D\u0430\u0437\u0438\u0432\u0430\u0454\u0442\u044C\u0441\u044F \u043C\u0456\u0440\u0430 \u043A\u0456\u043B\u044C\u043A\u043E\u0441\u0442\u0456 \u0456\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0456\u0457, \u0449\u043E \u0441\u043F\u043E\u0441\u0442\u0435\u0440\u0435\u0436\u0443\u0432\u0430\u043D\u0430 \u0432\u0438\u043F\u0430\u0434\u043A\u043E\u0432\u0430 \u0437\u043C\u0456\u043D\u043D\u0430 X \u043D\u0435\u0441\u0435 \u043F\u0440\u043E \u043D\u0435\u0432\u0456\u0434\u043E\u043C\u0438\u0439 \u043F\u0430\u0440\u0430\u043C\u0435\u0442\u0440 \u03B8, \u0432\u0456\u0434 \u044F\u043A\u043E\u0433\u043E \u0437\u0430\u043B\u0435\u0436\u0438\u0442\u044C \u0439\u043C\u043E\u0432\u0456\u0440\u043D\u0456\u0441\u0442\u044C X. \u0424\u043E\u0440\u043C\u0430\u043B\u044C\u043D\u043E \u0446\u0435 \u0434\u0438\u0441\u043F\u0435\u0440\u0441\u0456\u044F \u0444\u0443\u043D\u043A\u0446\u0456\u0457 \u0432\u043D\u0435\u0441\u043A\u0443 \u0432\u0438\u0431\u0456\u0440\u043A\u0438. \u0426\u044F \u0444\u0443\u043D\u043A\u0446\u0456\u044F \u043D\u0430\u0437\u0432\u0430\u043D\u0430 \u043D\u0430 \u0447\u0435\u0441\u0442\u044C \u0420\u043E\u043D\u0430\u043B\u044C\u0434\u0430 \u0424\u0456\u0448\u0435\u0440\u0430, \u0449\u043E \u043E\u043F\u0438\u0441\u0430\u0432 \u0457\u0457."@uk . . . . . . . . . . . . . . . . . "\u30D5\u30A3\u30C3\u30B7\u30E3\u30FC\u60C5\u5831\u91CF"@ja . . .