
Search Results

Found 8333 documents matching the query
Ipsen, Ilse C.F.
"This self-contained textbook presents matrix analysis in the context of numerical computation, with numerical conditioning of problems and numerical stability of algorithms at the forefront. Using a unique combination of numerical insight and mathematical rigor, it advances readers' understanding of two phenomena: sensitivity of linear systems and least squares problems, and numerical stability of algorithms."
Philadelphia: Society for Industrial and Applied Mathematics, 2009
e20450973
eBooks  Universitas Indonesia Library
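The sensitivity phenomenon this book analyzes can be illustrated in a few lines of NumPy (an illustrative sketch, not taken from the book): a large condition number bounds, and here visibly causes, amplification of a tiny data perturbation in the solution of a linear system.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])    # nearly singular, hence ill-conditioned
b = np.array([2.0, 2.0001])

x = np.linalg.solve(A, b)        # exact data: x = [1, 1]
x_pert = np.linalg.solve(A, b + np.array([0.0, 1e-4]))  # perturb b slightly

print(np.linalg.cond(A))           # large condition number, around 4e4
print(np.linalg.norm(x_pert - x))  # solution moves by O(1), not O(1e-4)
```

A perturbation of size 1e-4 in the right-hand side changes the solution by roughly the condition number times that amount, which is exactly the kind of sensitivity bound the book develops.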
"The method of least squares was discovered by Gauss in 1795. It has since become the principal tool to reduce the influence of errors when fitting models to given observations. Today, applications of least squares arise in a great number of scientific areas, such as statistics, geodetics, signal processing, and control.
In the last 20 years there has been a great increase in the capacity for automatic data capturing and computing. Least squares problems of large size are now routinely solved. Tremendous progress has been made in numerical methods for least squares problems, in particular for generalized and modified least squares problems and direct and iterative methods for sparse problems. Until now there has not been a monograph that covers the full spectrum of relevant problems and methods in least squares.
This volume gives an in-depth treatment of topics such as methods for sparse least squares problems, iterative methods, modified least squares, weighted problems, and constrained and regularized problems. The more than 800 references provide a comprehensive survey of the available literature on the subject."
Philadelphia : Society for Industrial and Applied Mathematics, 1996
e20443156
eBooks  Universitas Indonesia Library
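The basic problem the monograph treats, minimizing ||Ax - b|| over x for an overdetermined system, can be sketched with NumPy's built-in solver (an illustrative example, not from the book):

```python
import numpy as np

# Fit b = c0 + c1*t by least squares; the data lie exactly on b = 1 + 2t.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 3.0, 5.0, 7.0])
A = np.column_stack([np.ones_like(t), t])   # design matrix [1, t]

x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)   # recovers [1.0, 2.0]
```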
Huffel, Sabine van
"This is the first book devoted entirely to total least squares. The authors give a unified presentation of the TLS problem. A description of its basic principles is given, the various algebraic, statistical and sensitivity properties of the problem are discussed, and generalizations are presented. Applications are surveyed to facilitate uses in an even wider range of applications. Whenever possible, comparison is made with the well-known least squares methods.
A basic knowledge of numerical linear algebra, matrix computations, and some notion of elementary statistics is required of the reader; however, some background material is included to make the book reasonably self-contained."
Philadelphia: Society for Industrial and Applied Mathematics, 1991
e20451176
eBooks  Universitas Indonesia Library
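The core SVD construction of total least squares, perturbing both A and b minimally so that (A+E)x = b+r holds exactly, can be sketched as follows (an illustrative implementation under the classical generic-case assumptions, not code from the book):

```python
import numpy as np

def tls(A, b):
    """Basic TLS via the SVD: the solution comes from the right
    singular vector of [A b] for its smallest singular value."""
    n = A.shape[1]
    Z = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]                 # right singular vector, smallest sigma
    return -v[:n] / v[n]       # assumes v[n] != 0 (generic case)

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 2))
x_true = np.array([2.0, -1.0])
b = A @ x_true                  # noise-free: TLS recovers x exactly
print(tls(A, b))                # ≈ [2.0, -1.0]
```

With noise-free data the augmented matrix [A b] is rank-deficient, its smallest singular value is zero, and the corresponding singular vector is proportional to [x; -1], so the formula recovers x exactly.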
Lawson, Charles L.
"The material covered includes Householder and Givens orthogonal transformations, the QR and SVD decompositions, equality constraints, solutions in nonnegative variables, banded problems, and updating methods for sequential estimation. Both the theory and practical algorithms are included. The easily understood explanations and the appendix providing a review of basic linear algebra make the book accessible for the non-specialist."
Philadelphia: Society for Industrial and Applied Mathematics, 1995
e20443248
eBooks  Universitas Indonesia Library
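One of the orthogonal-transformation approaches the book covers, solving the least squares problem through a QR factorization, can be sketched with NumPy (whose `qr` applies Householder reflections internally); this is an illustrative example, not the book's own code:

```python
import numpy as np

# min ||Ax - b||: factor A = QR, then back-substitute R x = Q^T b.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

Q, R = np.linalg.qr(A)            # reduced QR; Q has orthonormal columns
x = np.linalg.solve(R, Q.T @ b)   # R is small, square, upper triangular
print(x)
```

Because Q has orthonormal columns, the residual norm is unchanged by the transformation, which is why this route is numerically preferable to forming the normal equations.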
Rizqa Fatika Fajrianti
"The principle of parsimony states that if there are multiple explanations for a phenomenon, the simplest explanation should be chosen. This principle is applied in data analysis to select the most efficient model, one that explains the variability of the data with as few parameters as possible. However, in some cases the data may involve a large number of measurements or variables. High-dimensional data can lead to complexity and difficulties in analysis, so dimensionality reduction becomes important. Principal Component Analysis (PCA) is one method that can be used for dimensionality reduction, extracting new variables and reducing the influence of irrelevant ones. However, PCA is not tolerant of missing values, so the Nonlinear Iterative Partial Least Squares (NIPALS) algorithm can be used to handle data containing missing values. The performance of the NIPALS algorithm is evaluated using the normalized root mean square error (NRMSE) and the Pearson correlation coefficient, and is compared with two other methods, Probabilistic Principal Component Analysis (PPCA) and SVDImpute. After a hundred trials on the COVIDiSTRESS survey data, the NIPALS algorithm was found to perform better and more stably in dimension reduction than SVDImpute and PPCA on data with missing-value rates of 1% to 15%."
Depok: Fakultas Matematika dan Ilmu Pengetahuan Alam Universitas Indonesia, 2023
S-pdf
UI - Skripsi Membership  Universitas Indonesia Library
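The property of NIPALS the abstract relies on, tolerance of missing values, comes from updating each score and loading using only the observed entries. A minimal sketch of that idea (illustrative only, not the thesis's actual implementation; it assumes the first column has observed data to seed the iteration):

```python
import numpy as np

def nipals_pca(X, n_comp, n_iter=100, tol=1e-8):
    """NIPALS PCA that tolerates NaN entries: every score/loading
    update sums only over observed data, via a zero-filled copy
    and an observation mask."""
    mask = ~np.isnan(X)
    M = mask.astype(float)
    Xf = np.where(mask, X, 0.0)          # zero-filled working copy
    scores, loadings = [], []
    for _ in range(n_comp):
        t = Xf[:, 0].copy()              # seed scores from first column
        for _ in range(n_iter):
            p = (Xf.T @ t) / (M.T @ (t * t))   # loadings from observed entries
            p /= np.linalg.norm(p)
            t_new = (Xf @ p) / (M @ (p * p))   # scores from observed entries
            if np.linalg.norm(t_new - t) < tol:
                t = t_new
                break
            t = t_new
        Xf -= M * np.outer(t, p)         # deflate the observed entries
        scores.append(t)
        loadings.append(p)
    return np.array(scores).T, np.array(loadings).T

Xt = np.outer([1.0, 2.0, 3.0, 4.0], [1.0, 0.5, 0.25])  # rank-1 data
Xm = Xt.copy()
Xm[0, 1] = np.nan                                      # one missing value
T, P = nipals_pca(Xm, 1)
print(np.nanmax(np.abs(T @ P.T - Xm)))   # near-zero reconstruction error
```

On rank-one data with a single missing entry, one component reconstructs every observed value, which is why the algorithm can serve as an imputation-free alternative to SVDImpute and PPCA.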
Erma Harviani
"In applications of regression analysis there is often a spatial (location) dependence effect, in which the observed value of the dependent variable at one location depends on the observed values at other locations. This characteristic is called spatial lag. Another form of dependence is spatial error, in which the error at one location is influenced by the errors at neighboring locations. A regression model involving spatial dependence effects is called a spatially dependent model. In practice, cross-sectional data may well exhibit both characteristics of spatial dependence at once. This thesis discusses the procedure for estimating the parameters of a model with both types of spatial dependence, spatial lag together with spatial error, using the Generalized Spatial Two Stage Least Squares (GS2SLS) method. The method uses Two Stage Least Squares, the Generalized Moments approach, and the Cochrane-Orcutt transformation. The resulting estimators are consistent. Keywords: Spatial Lag, Spatial Error, Two Stage Least Squares, Generalized Moments, Cochrane-Orcutt Transformation."
Depok: Universitas Indonesia, 2008
S27774
UI - Skripsi Open  Universitas Indonesia Library
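The Two Stage Least Squares building block inside GS2SLS can be sketched as follows (an illustrative example, not the thesis's code): the endogenous spatial lag Wy is instrumented with [X, WX, W²X], the standard instrument set for spatial lag models.

```python
import numpy as np

def two_stage_ls(y, X, W):
    """2SLS for y = rho*W*y + X*beta + e with the spatial lag
    instrumented by [X, WX, W^2 X]."""
    Z = np.column_stack([X, W @ y])                  # regressors incl. spatial lag
    H = np.column_stack([X, W @ X, W @ W @ X])       # instruments
    Zhat = H @ np.linalg.lstsq(H, Z, rcond=None)[0]  # stage 1: project Z onto H
    return np.linalg.lstsq(Zhat, y, rcond=None)[0]   # stage 2: regress y on Zhat

# Simulate a spatial lag model on a ring of 200 locations.
rng = np.random.default_rng(0)
n, rho, beta = 200, 0.4, np.array([1.0, 2.0])
W = np.zeros((n, n))
for i in range(n):                      # two neighbours each, row-normalized
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5
X = rng.standard_normal((n, 2))
y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + 0.01 * rng.standard_normal(n))
print(two_stage_ls(y, X, W))            # ≈ [beta_1, beta_2, rho]
```

The full GS2SLS procedure additionally estimates the spatial error parameter by generalized moments and re-runs 2SLS on Cochrane-Orcutt-transformed data; the snippet above covers only the lag-instrumenting stage.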
Rimbun Budiman
"Panel data are a combination of two kinds of data, cross-sectional data and time series data. The aim of this thesis is to find parameter estimates for incomplete panel data regression models with a one-way error component. The regression model is, moreover, a random effects model, meaning that differences in individual and time characteristics are accommodated in the error component of the model.
The method used to estimate the parameters is Feasible Generalized Least Squares (FGLS). In this method the error covariance matrix is unknown, so the variance components it contains must themselves be estimated. The variance components are estimated with a modification of the ANOVA estimation method proposed by Wallace and Hussain."
Depok: Fakultas Matematika dan Ilmu Pengetahuan Alam Universitas Indonesia, 2008
S-Pdf
UI - Skripsi Membership  Universitas Indonesia Library
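The FGLS step itself, plugging an estimated error covariance matrix into the GLS formula β = (XᵀΩ⁻¹X)⁻¹XᵀΩ⁻¹y, can be sketched generically (an illustrative example, not the thesis's Wallace-Hussain variance-component estimator):

```python
import numpy as np

def fgls(X, y, Omega_hat):
    """Generic FGLS: GLS with an estimated error covariance matrix."""
    Oi = np.linalg.inv(Omega_hat)   # fine for small n; factorize in practice
    return np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)

# Noise-free sanity check: with y = X @ beta exactly, FGLS recovers
# beta for any positive definite covariance estimate.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(30), rng.standard_normal(30)])
beta = np.array([1.0, -2.0])
y = X @ beta
Omega = np.diag(rng.uniform(0.5, 2.0, 30))   # a stand-in estimated covariance
print(fgls(X, y, Omega))                     # recovers [1.0, -2.0]
```

In the one-way random effects setting, Ω̂ is built from the estimated variance components of the individual effect and the idiosyncratic error rather than supplied directly as here.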
Davis, Timothy A.
"Computational scientists often encounter problems requiring the solution of sparse systems of linear equations. Attacking these problems efficiently requires an in-depth knowledge of the underlying theory, algorithms, and data structures found in sparse matrix software libraries. Here, Davis presents the fundamentals of sparse matrix algorithms to provide the requisite background. The book includes CSparse, a concise downloadable sparse matrix package that illustrates the algorithms and theorems presented in the book and equips readers with the tools necessary to understand larger and more complex software packages.
With a strong emphasis on MATLAB and the C programming language, Direct Methods for Sparse Linear Systems equips readers with the working knowledge required to use sparse solver packages and write code to interface applications to those packages. The book also explains how MATLAB performs its sparse matrix computations."
Philadelphia : Society for Industrial and Applied Mathematics, 2006
e20442876
eBooks  Universitas Indonesia Library
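Using a sparse direct solver through a library interface, as the book teaches, looks like this with SciPy (an illustrative example, not from the book); the CSC storage used here is the same compressed-column scheme CSparse is built around:

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import spsolve

# Solve Ax = b with a sparse direct (LU-based) solver.
A = csc_matrix(np.array([[4.0, 1.0, 0.0],
                         [1.0, 3.0, 0.0],
                         [0.0, 0.0, 2.0]]))
b = np.array([1.0, 2.0, 4.0])
x = spsolve(A, b)
print(x)
```

For a matrix this small the dense solver would do, but for large systems with mostly zero entries the sparse data structure and fill-reducing factorization are the difference between feasible and infeasible.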
Paramita Ayu Pawestri
"Partial Least Squares Regression (PLSR) is a regression technique that takes into account the pattern of the relationship between the response variable and the predictor variables. It can be used when the predictor variables are highly correlated, when the number of predictors exceeds the number of observations, or when there are random effects in the predictors. Using the NIPALS algorithm, PLSR forms components that are weighted linear combinations of the predictor variables and uses them to predict the response variable by Ordinary Least Squares; the components are orthogonal, that is, mutually uncorrelated, and fewer components are formed than there are predictor variables."
Depok: Fakultas Matematika dan Ilmu Pengetahuan Alam Universitas Indonesia, 2012
S43661
UI - Skripsi Open  Universitas Indonesia Library
Pennsylvania: Dowden, Hutchinson & Ross, 1977
519 LIN (1)
Buku Teks  Universitas Indonesia Library