HanG321 Blog
Be Shine, Be Smile, Be Wild

Hangover Part 3 / Blind Detective盲探 / Saving Mr. Banks

March 10, 2014|生活記事

Hangover Part 3
Haven’t watched Parts 1 & 2 yet. Some scenes feel like GTA. A good comedy movie.

Blind Detective 盲探
It would have been even better if it weren’t a mainland co-production; hearing 麥包’s voice coming out of mainland actors feels out of place….
What’s unexpected is…

Saving Mr. Banks
IMDb 7.7, but a bit boring and long. I guess I would enjoy it more once I know / have read Mary Poppins.

logstash & elasticsearch

March 9, 2014|elasticsearch, logstash, NoSQL|電腦編程

I discovered logstash last year, and finally played around with logstash and elasticsearch over the weekend; the Getting Started guide is pretty easy to walk through. However, when I tried to customize a pattern for a work-related log, the documentation didn’t come with good examples, and the API changed between 1.2 and 1.3, so some Google results refer to the deprecated version. Anyway, here I only document what I have played with so far.

Preparation:
Download a “flat” jar (logstash now ships with elasticsearch integrated): http://download.elasticsearch.org/logstash/logstash/logstash-1.3.3-flatjar.jar
curl is required; if it is not installed yet: $ sudo apt-get install curl

Tasting:
– Walk through “Getting started with logstash (standalone server example)”
– For production, the input would better be log4j rather than a file; this is just a test.

config file:

Since the log4j timestamp does not use the ISO format, we need to define a custom pattern; see http://logstash.net/docs/1.3.3/filters/grok#patterns_dir
Create a folder named “patterns”, then create a file inside it with the following content.

LOGDATETIME %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}
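Under the hood, a grok pattern is just a named regular expression built from smaller ones. A rough Python sketch of what LOGDATETIME expands to (the sub-pattern expansions below are simplified approximations of the real grok definitions):

```python
import re

# Simplified expansions of the built-in grok patterns used by LOGDATETIME
# (the real definitions are slightly more permissive than these).
YEAR     = r"\d{4}"
MONTHNUM = r"0?[1-9]|1[0-2]"
MONTHDAY = r"0?[1-9]|[12]\d|3[01]"
HOUR     = r"[01]?\d|2[0-3]"
MINUTE   = r"[0-5]\d"
SECOND   = r"[0-5]\d(?:\.\d+)?"

LOGDATETIME = (
    rf"(?:{YEAR})-(?:{MONTHNUM})-(?:{MONTHDAY})"
    rf" (?:{HOUR}):(?:{MINUTE}):(?:{SECOND})"
)

# Matches the log4j-style timestamp from the sample log line.
print(bool(re.fullmatch(LOGDATETIME, "2014-03-08 11:50:02.001")))  # True
```

This is only to show the mechanics; in logstash the expansion is done for you once the pattern file is on patterns_dir.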

test.conf:
input {
  stdin {
    type => "stdin-type"
  }
 
  file {
    type => "xmllog"
    # path must be absolute
    path => "/workspaces/logstatsh/log4j-xml.log"
  }
}
 
filter {
  grok {
    patterns_dir => "./patterns"
    # sample log: 2014-03-08 11:50:02.001 INFO [loggingHandler] (logger.java:123) - <?xml .....>
    match => [ "message", "%{LOGDATETIME:logDateTime}%{SPACE}%{LOGLEVEL:level}%{SPACE}\[%{DATA:thread}\]%{SPACE}\(%{JAVACLASS:class}:%{NUMBER:line}\)%{SPACE}-%{SPACE}(?<xmlData>.*$)" ]
 
    # remove fields we don't care about, e.g. the line number
    remove_field => [ "line" ]
  }
 
  date {
    # matched pattern will replace "@timestamp" field to logDateTime (parsed above)
    match => ["logDateTime", "YYYY-MM-dd HH:mm:ss.SSS"]
  }
 
  xml {
    source => "xmlData"
    target => "parsedXml"
  }
}
 
output {
  stdout { debug => true }
  elasticsearch { embedded => true }
}
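For a rough idea of what the grok match above extracts, here is a simplified Python equivalent with named groups (the real grok patterns such as %{DATA} and %{JAVACLASS} are more permissive than these approximations):

```python
import re

# Simplified stand-in for the grok match in test.conf above.
LOG_RE = re.compile(
    r"(?P<logDateTime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})\s+"
    r"(?P<level>[A-Z]+)\s+"
    r"\[(?P<thread>[^\]]+)\]\s+"
    r"\((?P<cls>[\w.]+):(?P<line>\d+)\)\s+-\s+"
    r"(?P<xmlData>.*)$"
)

line = ("2014-03-09 19:01:03.005 INFO [loggingHandler] (logger.java:123) - "
        "<note><to>Tove</to></note>")
fields = LOG_RE.match(line).groupdict()
print(fields["level"], fields["thread"])  # INFO loggingHandler
```

Each named group becomes a field on the event, just like %{LOGLEVEL:level} etc. do in the grok filter; the remove_field setting then drops "line" before output.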

N.B. In the xml filter, remember to define target => “xxxx”! Otherwise the console debug logs an xmlparsefailure and a NoMethodException. With a target set, all first-level child XML tags are indexed and fields are created automatically; there is no need to set up XPath.
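What the xml filter does with a target set is roughly equivalent to this Python sketch (using the stdlib ElementTree; field names come from the first-level child tags):

```python
import xml.etree.ElementTree as ET

xml_data = ("<note><to>Tove</to><from>Jani</from>"
            "<heading>Reminder</heading>"
            "<body>Don't forget me this weekend!</body></note>")

root = ET.fromstring(xml_data)
# Each first-level child becomes a field; values are lists because
# the same tag may repeat at the same level.
parsed = {}
for child in root:
    parsed.setdefault(child.tag, []).append(child.text)

print(parsed["to"])  # ['Tove']
```

This mirrors the "parsedXml" object in the elasticsearch result further down, where every value is a single-element list.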

Okay, now run: $ java -jar logstash-1.3.3-flatjar.jar agent -f test.conf -- web

Paste the following line into stdin or the watched log file (XML from the w3schools example):

2014-03-09 19:01:03.005 INFO [loggingHandler] (logger.java:123) - <note><to>Tove</to><from>Jani</from><heading>Reminder</heading><body>Don't forget me this weekend!</body></note>

Then, with debug => true, the logstash console displays the parsed event, and it is sent to elasticsearch with the following result.

Now go to http://localhost:9292/index.html#/dashboard/file/logstash.json to see the result!

elasticsearch JSON format:
{
  "_index": "logstash-2014.03.09",
  "_type": "stdin-type",
  "_id": "94ycZEdfQDCF_c5PxP-zlg",
  "_score": null,
  "_source": {
    "message": "2014-03-09 19:01:03.005 INFO [loggingHandler] (logger.java:123) - <note><to>Tove</to><from>Jani</from><heading>Reminder</heading><body>Don't forget me this weekend!</body></note>",
    "@version": "1",
    "@timestamp": "2014-03-09T19:01:03.005+10:00",
    "type": "stdin-type",
    "host": "hang321-mintvm",
    "logDateTime": "2014-03-09 19:01:03.005",
    "level": "INFO",
    "thread": "loggingHandler",
    "class": "logger.java",
    "xmlData": "<note><to>Tove</to><from>Jani</from><heading>Reminder</heading><body>Don't forget me this weekend!</body></note>",
    "parsedXml": {
      "to": [
        "Tove"
      ],
      "from": [
        "Jani"
      ],
      "heading": [
        "Reminder"
      ],
      "body": [
        "Don't forget me this weekend!"
      ]
    }
  },
  "sort": [
    1394355663005,
    1394355663005
  ]
}

spotify offline mode

March 4, 2014|生活記事, 科技資訊

After using Spotify for a while, today I suddenly noticed a lot of connections even though I wasn’t browsing anything online, and only then realised that Spotify uses peer-to-peer for load balancing.

ref:
https://support.spotify.com/au/learn-more/guides/#!/article/Listen-offline


Copyright © 2004-2021 hang321.net. All Rights Reserved