just another foolish blog

19.7.14

Blogspot after all these years.

5:50 AM Posted by Eren Yağdıran
I've decided to start keeping a blog again, in short bursts.
If you asked me when I last kept a blog, it's been nearly years :) I changed a lot over those years. My hair fell out, I aged at least 2x per unit time, and I can say my eyes were opened, too. Meanwhile my faith in this country has dropped quite a bit; the thousand and one political events unfolding every day wear a person out. So, what can you do? Isn't it time to leave?


15.2.14

Freedom

4:27 AM Posted by Eren Yağdıran

Finally, it's time to leave...

16.11.13

undefined reference to `yywrap' when compiling PAM

2:22 PM Posted by Eren Yağdıran
If you get this error while compiling the PAM modules, make sure the flex package is installed.

I was working on Debian.

/home/eren/PAM/Linux-PAM-1.1.1/conf/pam_conv1/pam_conv_l.c:871: undefined reference to `yywrap'
collect2: error: ld returned 1 exit status
make[4]: *** [pam_conv1] Error 1
make[4]: Leaving directory `/home/eren/PAM/Linux-PAM-1.1.1/conf/pam_conv1'
make[3]: *** [all] Error 2
make[3]: Leaving directory `/home/eren/PAM/Linux-PAM-1.1.1/conf/pam_conv1'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/home/eren/PAM/Linux-PAM-1.1.1/conf'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/home/eren/PAM/Linux-PAM-1.1.1'
make: *** [all] Error 2

I fixed this problem by installing the flex package from the Debian repository.

apt-get install flex

I hope it also works for you.
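
A bit of background, for whoever is curious: the scanner that flex generates calls yywrap() when it reaches the end of its input, and flex's support library libfl (linked with -lfl) is what normally provides the default implementation, i.e. exactly the symbol the linker is complaining about; that is presumably why installing the package makes the link succeed. As a rough alternative sketch, if you ever can't install flex, a one-line stub compiled into the build should also satisfy the reference (I haven't tried it on this exact source tree, though):

/* Hypothetical stub for builds where libfl is not available.
 * The generated scanner calls yywrap() at end of input;
 * returning 1 tells it there is no further input to switch to. */
int yywrap(void)
{
    return 1;
}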


24.8.13

[arch linux] Systemd-journald high CPU usage.

11:44 AM Posted by Eren Yağdıran
If you run into this problem, do what I tell you before headbutting the screen.

Run journalctl --disk-usage and check how big the journals created on the file system are. If it isn't gigabytes' worth, close this page.

If the disk usage really is huge, then in the file

/etc/systemd/journald.conf

 SystemMaxUse=16M
 ForwardToSyslog=no

change the settings to these values (they live under the [Journal] section).

Also add shutdown as an initramfs hook so that systemd does its journal clean-ups at shutdown as well. For that, in

/etc/mkinitcpio.conf

add "shutdown" to the HOOKS line.

Finally, run

mkinitcpio -p linux

to regenerate the initramfs.

To get rid of your existing journals, run

find /var/log/journal -name "*.journal~" -exec rm {} \;

and there you have it, a squeaky-clean Arch install.

25.10.12

Even Tree problem & my solution

5:54 PM Posted by Eren Yağdıran

You are given a tree (a simple connected graph with no cycles). You have to remove as many edges from the tree as possible to obtain a forest, with the condition that each connected component of the forest contains an even number of vertices.
Your task is to calculate the number of removed edges in such a forest.

Input:
The first line of input contains two integers N and M. N is the number of vertices and M is the number of edges. 2 <= N <= 100.
The next M lines each contain two integers ui and vi, which specify an edge of the tree (1-based indices).

Output:
Print a single integer, which is the answer.
Sample Input:
10 9
2 1
3 1
4 3
5 2
6 1
7 2
8 6
9 8
10 8
 
Sample Output:
2
 
Explanation: On removing the edges (1, 3) and (1, 6), we can get the desired result.
(Figures: the original tree and the decomposed tree were shown here as images.)

Note: The tree in the input will be such that it can always be decomposed into components containing an even number of nodes.


My solution is on pastebin - click here
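
For reference, here is a rough sketch in C of the usual approach (not necessarily what the pastebin code does, and the names below are mine): root the tree at vertex 1, compute every subtree size with a DFS, and cut the edge above each non-root vertex whose subtree has an even number of nodes; every such cut leaves an even component below and keeps the rest even, so counting them gives the answer.

/* Even Tree: count removable edges so that every component stays even. */
#include <stdio.h>

#define MAXN 105

int adj[MAXN][MAXN];   /* adjacency matrix; fine since N <= 100 */
int n, m, removed;

/* Returns the number of vertices in the subtree rooted at v. */
int dfs(int v, int parent) {
    int size = 1;
    for (int u = 1; u <= n; u++) {
        if (adj[v][u] && u != parent) {
            int sub = dfs(u, v);
            if (sub % 2 == 0)
                removed++;   /* the edge (v, u) can be cut */
            size += sub;
        }
    }
    return size;
}

int main(void) {
    scanf("%d %d", &n, &m);
    for (int i = 0; i < m; i++) {
        int u, v;
        scanf("%d %d", &u, &v);
        adj[u][v] = adj[v][u] = 1;
    }
    dfs(1, 0);
    printf("%d\n", removed);
    return 0;
}

With the sample input above it prints 2, matching the expected output.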

18.10.12

Let's scrape Google

8:34 AM Posted by Eren Yağdıran
I just needed a couple of URLs from Google results.

So instead of writing a pure HTTP client to spoof Google, I wrote a bit of junk JS to fetch URLs from Google without getting banned :)

I used Firefox and the Greasemonkey plugin.

Basically, what this script does is:


  1. Wait until the Google AJAX request is loaded.
  2. Parse the body content to obtain URLs using regular expressions.
  3. After those expressions are evaluated, send the URLs to an external website with a GET request.
  4. Once sending is complete, the script generates a fake "next" click event to fetch the next page.
  5. Go back to step 1 until all results are exhausted.
  6. Lastly, modify the gonder function to adapt it to your needs.


Check out my script here