Before, I always used NEST's bulk(), but now my data has a geo_point field. As I mentioned in earlier topics, I use GeoLocation to define the location, but it wasn't being converted to the geo_point type. So I tried the bulk API, which expects NDJSON, and edited my data like this:
{"index":{"_id":"1"}}
{"id":174,"name":"JET REPORTS DANMARK ApS","address":"Lyskær 3","areaCode":2730,"city":"Herlev","location":{"lat":55.7145620017324,"lon":12.4359915318663}}
{"index":{"_id":"2"}}
{"id":568,"name":"OUTOKUMPU A/S","address":"Alhambravej 3","areaCode":1826,"city":"Frederiksberg C","location":{"lat":55.6743134573089,"lon":12.5440667455454}}

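For context, the NEST side looks roughly like this (a minimal sketch, NEST 6.x syntax assumed; the Company class and the "companies" index name are placeholder names of mine):

using System;
using Nest;

public class Company
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
    public int AreaCode { get; set; }
    public string City { get; set; }

    // NEST's GeoLocation type is automapped to geo_point
    public GeoLocation Location { get; set; }
}

class Program
{
    static void Main()
    {
        var client = new ElasticClient(new ConnectionSettings(new Uri("http://localhost:9200"))
            .DefaultIndex("companies"));

        // Create the index with AutoMap() so location gets the geo_point type
        var createResponse = client.CreateIndex("companies", c => c
            .Mappings(ms => ms.Map<Company>(m => m.AutoMap())));
        Console.WriteLine(createResponse.IsValid);
    }
}
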
I imported them into ES successfully, and location is mapped as geo_point. But is it really necessary for every document to be preceded by an action line like {"index":{"_id":"1"}} or {"index":{"_id":"2"}}? Can't I just bulk them like this:
{"id":174,"name":"JET REPORTS DANMARK ApS","address":"Lyskær 3","areaCode":2730,"city":"Herlev","location":{"lat":55.7145620017324,"lon":12.4359915318663}}
{"id":568,"name":"OUTOKUMPU A/S","address":"Alhambravej 3","areaCode":1826,"city":"Frederiksberg C","location":{"lat":55.6743134573089,"lon":12.5440667455454}}

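For comparison, this is what I mean by NEST's bulk() (again just a sketch, reusing the Company class above); as far as I can tell, IndexMany writes the {"index":{...}} action line for each document by itself:

using System;
using System.Collections.Generic;
using Nest;

class BulkExample
{
    static void Main()
    {
        var client = new ElasticClient(new ConnectionSettings(new Uri("http://localhost:9200"))
            .DefaultIndex("companies"));

        var docs = new List<Company>
        {
            new Company
            {
                Id = 174, Name = "JET REPORTS DANMARK ApS", Address = "Lyskær 3",
                AreaCode = 2730, City = "Herlev",
                Location = new GeoLocation(55.7145620017324, 12.4359915318663)
            },
            new Company
            {
                Id = 568, Name = "OUTOKUMPU A/S", Address = "Alhambravej 3",
                AreaCode = 1826, City = "Frederiksberg C",
                Location = new GeoLocation(55.6743134573089, 12.5440667455454)
            },
        };

        // One _bulk request; the client serializes the action-and-source line pairs
        var bulkResponse = client.IndexMany(docs, "companies");
        Console.WriteLine(bulkResponse.Errors);
    }
}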