Importing CSV in Elasticsearch and display it in Kibana. #1310

Closed

SabareeshSS opened this issue Jun 16, 2014 · 7 comments

Comments

@SabareeshSS

Hi,
I imported a CSV file with 5000 lines of data into Elasticsearch using Logstash, with a logstash.conf file I created.

  1. I lost a lot of data from my CSV file, and I am unable to figure out which lines were left out.
  2. The collected data is displayed in a Kibana dashboard, but Kibana shows only 500 events, and the ones displayed are the last 500.
     How can I recover the other data? And is there any data limit in Kibana?
  3. After importing the 1st CSV and getting the output described above, I imported a second CSV with only 100 rows the same way
     (logstash.conf --> ES --> Kibana).
     Only the latest CSV is displayed, so my question is: how do I get back the data from the 1st CSV?

I am a newbie in log centralization and not very aware of the workarounds.
Please help me.
My questions may not be well structured because I have a lot of doubts about the ELK stack.
:)
[screenshot of the Kibana dashboard]
My logstash.conf file:
input {
  stdin {
    type => "stdin-type"
  }
  file {
    path => ["C:/tomcat/webapps/logstash-1.4.1/bin/var/ServiceConnect.csv"]
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Id","Stamp","Comments"]
    separator => ","
  }
}

output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}

@thejaspm

1) Use a grok filter instead of the csv filter, and check for lines that fail with "_grokparsefailure" in the tags. The ones with grok parse failures are the lines that are not being indexed (a short sketch is below).
2) From your screenshot, Kibana is displaying 1505 events; 500 is just the paginated data.
3) As long as you are pointing to the same Elasticsearch with persisted indexes, this should not have happened. (Check whether you have added any filter in your dashboard that is limiting the results.)
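For reference, a minimal sketch of that approach, assuming the Id/Stamp/Comments layout from the config above (with Id numeric, Stamp an ISO8601 timestamp, and Comments free text); the grok pattern and the failure-log path are placeholders to adapt:

filter {
  grok {
    # replaces the csv filter; lines that do not match the pattern get tagged "_grokparsefailure"
    match => [ "message", "%{NUMBER:Id},%{TIMESTAMP_ISO8601:Stamp},%{GREEDYDATA:Comments}" ]
  }
}

output {
  if "_grokparsefailure" in [tags] {
    # failed lines are written here so you can see exactly which rows were dropped
    file { path => "C:/tomcat/webapps/logstash-1.4.1/bin/var/failed_lines.log" }
  } else {
    elasticsearch { host => "localhost" }
  }
}

Any line that does not match the pattern ends up in the failure file instead of Elasticsearch, so the dropped rows become visible.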

@SabareeshSS

Hi thejaspm,
Thanks for your valuable comment. I have a few replies:

  1. Please help me with how to import a CSV using grok filters.
  2. It is displaying 1505 events because I was zoomed in...
  3. I didn't add any filters to it.

By the way, my Elasticsearch has 4 indices, and I am unable to display the index I want in the Kibana dashboard.
I have seen many screenshots on Google, and all of them show only 500 as the paginated data. Is there any restriction in Kibana regarding this?
If so, in what sort order are they displayed?

Please help me.. :)

@thejaspm

  1. I can provide a few links with examples of using the grok filter, but it's best if you Google it and experiment in a sandbox setup:
     https://home.regit.org/2014/01/a-bit-of-logstash-cooking/, grok debugger - http://grokdebug.herokuapp.com/

You can configure the index in the "Configure dashboard" option; look for a gear icon at the top of the page.

For setting the pagination limit, go to the "All events" section, open the configure menu, and change the limit. "500" is just the default value.

@SabareeshSS

Hi thejaspm,

Thanks a lot!

Actually I had tried all of that; I was just one step behind. What happened is that I forgot to REFRESH Kibana.. :)

lol, I found it.

Those links are really helpful.

Maybe off topic (for this issue):
Could you tell me what things we can do with this ELK stack?
I mean the functionalities..

It will be helpful for me..

Many thanks :)

@rashidkpc

Closing this since it is not a bug.

@SabareeshSS

Hi Rashid,
Can we do custom analytics with this? I need your help. How do I reach you?

@alapati96

How can I upload a CSV file to Elasticsearch without the Logstash file?
