
As you get deeper into it, they hook you into the server and other add-ons, and it ends up costing thousands.

Pandas is better but requires programming.



I second pandas, and would also highlight seaborn, plotly, and dash as complementary data visualization libraries.
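A minimal sketch of that combination, assuming pandas and seaborn are installed (the "region"/"sales" table here is made up for illustration):

```python
import pandas as pd
import seaborn as sns
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

# Hypothetical sales table; swap in your own DataFrame and columns.
df = pd.DataFrame({
    "region": ["east", "east", "west", "west"],
    "sales":  [10, 14, 7, 9],
})

# pandas does the table work, seaborn the plotting.
summary = df.groupby("region", as_index=False)["sales"].mean()
ax = sns.barplot(data=summary, x="region", y="sales")
ax.figure.savefig("sales_by_region.png")
```

plotly and dash slot in the same way: plotly for interactive figures, dash when you need to wire them into a web app.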

Tableau is fine for what it is, but I've found that requests from stakeholders often grow to a point where you either can't do it in Tableau or have to move mountains to get it to work... so, in essence, the sunk cost fallacy makes Tableau millions.


PSA: If you haven't already learned pandas, learn dplyr/ggplot2 instead. Yes, R is a pretty clunky language (though it's closer to Lisp than almost anything else as popular), but ggplot2 and dplyr are 100% the best currently available way to visualise SQL tables.


I would second the decision to use ggplot2 / dplyr, and would also add data.table to the mix. That combination has been invaluable for me, allowing me to visualize all of my structured data.


I started with R but switched to Python because all our pipelines were already written in Python (web scrapers, data-processing scripts, REST APIs), so I just learned pandas and it's been fine. That said, I do think dplyr's syntax is great, and I prefer it to pandas'.


Yeah, totally. I spend a lot more of my time writing Python for anything that isn't data exploration/analysis, for exactly that reason.

I still refuse to learn pandas well enough to replace dplyr, though; its API is just so painful to use compared to how easy this stuff is in R.
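For comparison, here's what a dplyr-style filter → mutate → group_by → summarise pipeline looks like as a pandas method chain (columns here are hypothetical, just to show the shape of the API):

```python
import pandas as pd

df = pd.DataFrame({
    "group": ["a", "a", "b"],
    "value": [1, 2, 3],
})

# dplyr equivalent:
#   df |> filter(value > 1) |> mutate(double = value * 2) |>
#         group_by(group) |> summarise(total = sum(double))
result = (
    df[df["value"] > 1]                       # filter()
    .assign(double=lambda d: d["value"] * 2)  # mutate()
    .groupby("group", as_index=False)         # group_by()
    .agg(total=("double", "sum"))             # summarise()
)
```

The pandas version works, but the mix of boolean indexing, `assign` with lambdas, and named aggregation tuples is exactly the kind of API friction being complained about here; the dplyr version reads as four plain verbs.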


Do you load the whole table into memory though?
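For what it's worth, pandas can avoid loading everything at once by streaming in chunks; both `read_csv` and `read_sql` accept a `chunksize` argument. A small self-contained sketch (using an in-memory CSV as a stand-in for a big table):

```python
import io
import pandas as pd

# Stand-in for a large file or SQL result set.
csv = io.StringIO("value\n1\n2\n3\n4\n5\n")

# chunksize turns the read into an iterator of small DataFrames,
# so only one chunk is in memory at a time.
total = 0
for chunk in pd.read_csv(csv, chunksize=2):  # 2 rows per chunk
    total += chunk["value"].sum()
# total is 15, the same as summing the whole table in one read
```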


$70/mo includes a server (Tableau Online).



