
Search Results

Found 28,419 documents matching the query
Arif Muntasa
"Facial detection is a crucial stage in the facial recognition process. Misclassification during the facial detection process will impact recognition results. In this research, windowing-system facial detection using the Gabor kernel filter and the fast Fourier transform was proposed. The training-set images, for both facial and non-facial images, were processed to obtain local features by using the Gabor kernel filter and the fast Fourier transform. The local features were measured using probabilistic learning vector quantization. In this process, facial and non-facial features were classified using the labels 1 and -1. The proposed method was evaluated using facial and non-facial image testing sets taken from the MIT+CMU image database. The testing images were first enhanced before the detection process using four different enhancement methods: histogram equalization, adaptive histogram equalization, contrast limited adaptive histogram equalization, and the single-scale retinex method. The detection results demonstrated that the highest average accuracy was 83.44%."
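The abstract above combines a Gabor kernel filter with the fast Fourier transform for local feature extraction. A minimal sketch of that combination, assuming a standard real-valued Gabor kernel and frequency-domain filtering (the kernel parameters and window sizes here are illustrative choices, not values from the paper):

```python
import numpy as np

def gabor_kernel(size=31, sigma=4.0, theta=0.0, lam=10.0, psi=0.0, gamma=0.5):
    # Real part of a 2D Gabor kernel: a Gaussian envelope times a cosine carrier.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotate coordinates by theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

def gabor_filter_fft(image, kernel):
    # Filter in the frequency domain: zero-pad the kernel to the image size,
    # multiply the spectra, and transform back (a circular convolution).
    h, w = image.shape
    spectrum = np.fft.fft2(image) * np.fft.fft2(kernel, s=(h, w))
    return np.real(np.fft.ifft2(spectrum))
```

Filtering a detection window this way yields the local feature map that would then be handed to the classifier.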
Depok: Faculty of Engineering, Universitas Indonesia, 2017
UI-IJTECH 8:1 (2017)
Journal Article  Universitas Indonesia Library
Arif Muntasa
2017
J-Pdf
Journal Article  Universitas Indonesia Library
Amanda Fairuz Syifa
"The growth in Windows 11 usage necessitates an evaluation of this operating system. Although it is an update of Windows 10, the main focus remains on security risks, given the increasing complexity of cyber attacks. Many attacks occur at the endpoint level, making user and data protection crucial. This study evaluates security vulnerabilities and potential attacks on Windows 11 Home and Enterprise using the Information System Security Assessment Framework (ISSAF) method. The results show significant vulnerabilities in the SMB and RDP protocols, with Windows 11 Enterprise being more susceptible to certain attacks such as the SMB Relay Attack. Other risks include potential backdoor installation. Recommended mitigations include enabling SMB signing, enforcing complex password policies, disabling RDP when not in use, and activating antivirus software. This research provides valuable insights for enhancing the security of Windows 11."
Depok: Faculty of Engineering, Universitas Indonesia, 2024
S-pdf
UI - Undergraduate Thesis (Membership)  Universitas Indonesia Library
Rika
"In recent years, face recognition systems have been widely used in many areas as technology has advanced, and various studies have been conducted to keep improving their accuracy. In this research, Learning Vector Quantization (LVQ) and Fuzzy Kernel Learning Vector Quantization (FKLVQ) are used as classification methods. The data used is the Labeled Faces in the Wild-a (LFW-a) database, which imposes no restrictions on background, expression, pose, and so on. Based on test results on the LFW-a database, face recognition with the LVQ method reaches a highest accuracy of 89.33%, and the FKLVQ method likewise reaches a highest accuracy of 89.33%."
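Both classifiers in the abstract above build on the basic LVQ update rule: pull the winning prototype toward a same-class sample and push it away otherwise. A minimal pure-Python LVQ1 sketch (the function names, toy data, learning rate, and epoch count are illustrative, not taken from the thesis):

```python
def train_lvq1(samples, labels, protos, proto_labels, lr=0.2, epochs=10):
    # LVQ1: for each sample, find the nearest prototype; move it toward the
    # sample if their labels agree, and away from it otherwise.
    protos = [list(p) for p in protos]
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            d = [sum((a - b) ** 2 for a, b in zip(x, p)) for p in protos]
            w = d.index(min(d))                     # winning prototype
            sign = 1.0 if proto_labels[w] == y else -1.0
            protos[w] = [p + sign * lr * (a - p) for a, p in zip(x, protos[w])]
        lr *= 0.9                                   # decay the learning rate
    return protos

def lvq_predict(x, protos, proto_labels):
    # Classify x with the label of its nearest prototype.
    d = [sum((a - b) ** 2 for a, b in zip(x, p)) for p in protos]
    return proto_labels[d.index(min(d))]
```

The fuzzy-kernel variant (FKLVQ) replaces the squared Euclidean distance with a kernel-induced distance and weights updates by fuzzy membership, but the winner-update loop keeps this same shape.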
2018
S-Pdf
UI - Undergraduate Thesis (Membership)  Universitas Indonesia Library
Darien Jonathan
"The normal distribution is a data distribution defined by the mean and standard deviation of a data set; it can be used to group data by their position relative to the standard deviation of the set. Learning Vector Quantization is a type of neural network that learns from the inputs it receives and gives the appropriate outputs, using supervised and competitive learning. This thesis discusses the implementation and analysis of both methods to verify the plagiarism detection results of a plagiarism detection system based on latent semantic analysis, derived from the Simple-O program. Several modifications were made to improve verification accuracy, including varying the parameters of the normal distribution method: changing the standard deviation limits and changing the multiplier on the score limit at a given standard deviation. Both turn out to be directly proportional to relevance (recall), but not to accuracy (F-measure). Modifications were also made to the learning rate parameter of the learning vector quantization algorithm, which proves to be inversely proportional to both relevance and accuracy. Variations and analysis were then carried out on seven output quantities of the latent semantic analysis plagiarism detection system (Frobenius norm, slice, and pad, together with their combinations), showing that the Frobenius norm must be included when evaluating the similarity of two texts. Finally, the verification results of the two methods were combined with an AND operation, which gives mixed results; a balance between the precision and recall of each verification is needed for the AND operation to perform well.
With the right combination of methods and parameters, system accuracy improves from 35-46% in earlier research to a maximum of 65.98%."
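The normal-distribution method in the abstract above groups scores by their position relative to the standard deviation. A minimal sketch of such standard-deviation banding (the banding rule shown is a generic illustration, not Simple-O's exact rule):

```python
import statistics

def sd_band(scores, x):
    # Band index = how many whole standard deviations x lies from the mean;
    # int() truncates toward zero, so band 0 covers roughly mean +/- 1 SD.
    mu = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return int((x - mu) / sd) if sd > 0 else 0
```

Changing the standard-deviation limits or the multiplier on those limits, as the thesis does, amounts to rescaling the band boundaries before this truncation step.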
Faculty of Engineering, Universitas Indonesia, 2016
S62578
UI - Undergraduate Thesis (Membership)  Universitas Indonesia Library
Prag, John
London: British Museum Press, 1999
930.101 PRA m
Textbook (SO)  Universitas Indonesia Library
Pinem, Josua Geovani
"Data security has become a vital part of any organization that runs an information system. Threats from the Internet keep evolving and can now deceive firewalls as well as antivirus software; the volume of incoming attacks has also grown too large for firewalls or antivirus software to process. Security is therefore usually strengthened by adding an Intrusion Detection System (IDS), with either anomaly-based or signature-based detection. In this research, Big Data techniques are used to process the large volume of attack data, and anomaly-based detection is proposed using the Learning Vector Quantization (LVQ) algorithm.
Learning Vector Quantization is a neural network technique that learns from its inputs and then gives the appropriate output for each input. Modifications were made to improve test accuracy by varying the test parameters of LVQ: varying the learning rate, the number of epochs, and k-fold cross-validation yielded more efficient results.
The output is obtained by computing information retrieval metrics from the confusion matrix of each attack class. Principal Component Analysis (PCA) is used alongside LVQ to improve system performance by reducing the data dimensionality. Using 18 principal components, the dataset is reduced by 47.3%, with a best recognition rate of 96.52% and a time efficiency improvement of up to 43.16% over not using PCA."
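The abstract above pairs PCA dimensionality reduction with LVQ classification. A minimal PCA-projection sketch via SVD (the 18-component setting mirrors the abstract, but the function name and data shape are illustrative and the dataset synthetic):

```python
import numpy as np

def pca_reduce(X, n_components):
    # Center the data and project it onto the top principal directions,
    # taken from the SVD of the centered matrix.
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]        # (n_components, n_features)
    return Xc @ components.T, components, mean
```

The reduced matrix would then replace the raw features as LVQ input, which is where the reported reduction in data size and training time comes from.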
Depok: Faculty of Engineering, Universitas Indonesia, 2017
S67412
UI - Undergraduate Thesis (Membership)  Universitas Indonesia Library
Scott, Richard P.
California: Ziff-Davis, 1995
005.43 SCO p
Textbook (SO)  Universitas Indonesia Library
Sebastopol, CA: O'Reilly, 1990
005.43 XVI
Textbook (SO)  Universitas Indonesia Library
Coakes, Sheridan J.
Sydney : John Wiley & Sons, 2001
005.36 COA s
Textbook (SO)  Universitas Indonesia Library