
(Slightly skeptical) Open Source Software Educational Society


Welcome to the DARPA Open Catalog, which contains a curated list of DARPA-sponsored software and peer-reviewed publications. DARPA sponsors fundamental and applied research in areas such as data science, cyber security, and anomaly detection, which may lead to experimental results and reusable technology designed to benefit multiple government domains. The DARPA Open Catalog organizes publicly releasable material from DARPA programs. DARPA has an open strategy to help increase the impact of government investments, and is interested in building communities around government-funded software and research. The table on this page lists the programs currently participating in the catalog. Report a problem: opencatalog@darpa.mil

Open-source Weave liberates data for journalists, citizens
Data nerds from government and academia gathered Friday at Northeastern University to show off the latest version of Weave, an open-source, web-based platform designed to visualize "any available data by anyone for any purpose." The software has a lot of potential for journalists. Weave is supported by the Open Indicators Consortium, an unusual partnership of planning agencies and universities that wanted better tools to inform public policy and community decision-making. The groups organized and agreed to share data and code in 2008, well before Gov 2.0 was hot. Think of Weave as more programming language than app. Data in Weave is linked, which means you can view the same datapoint from many angles. The software reminds me of SPSS, from my college poli sci days. Georges Grinstein, a professor of computer science at UMass Lowell, develops Weave with a team of some 20 students.

Software development methodology
A software development methodology or system development methodology in software engineering is a framework used to structure, plan, and control the process of developing an information system. Common methodologies include waterfall, prototyping, iterative and incremental development, spiral development, rapid application development, and extreme programming. A methodology can also include aspects of the development environment (e.g., IDEs), model-based development, computer-aided software development, and the use of particular frameworks (e.g., programming libraries or other tools). The software development methodology (SDM) framework didn't emerge until the 1960s. Three basic approaches are applied to software development methodology frameworks, and a wide variety of such frameworks has evolved over the years, each with its own recognized strengths and weaknesses.

Anatomy of the Linux kernel
History and architectural decomposition. M. Tim Jones, published on June 06, 2007.
Given that the goal of this article is to introduce you to the Linux kernel and explore its architecture and major components, let's start with a short tour of Linux kernel history, then look at the Linux kernel architecture from 30,000 feet, and finally examine its major subsystems. The Linux kernel is over six million lines of code, so this introduction is not exhaustive; use the pointers to more content to dig in further.
A short tour of Linux history: Linux or GNU/Linux? You've probably noticed that Linux as an operating system is referred to in some cases as "Linux" and in others as "GNU/Linux." While Linux is arguably the most popular open source operating system, its history is actually quite short considering the timeline of operating systems. UNIX itself dates to the late 1960s at AT&T's Bell Labs; twenty years later, Andrew Tanenbaum created a microkernel version of UNIX®, called MINIX (for minimal UNIX), that ran on small personal computers.

Best Coding Practices
Best coding practices are a set of informal rules that the software development community has learned over time which can help improve the quality of software.[1] Many computer programs remain in use for far longer than the original authors ever envisaged (sometimes 40 years or more),[2] so any rules need to facilitate both initial development and subsequent maintenance and enhancement by people other than the original authors. In the ninety-ninety rule, Tom Cargill is credited with this explanation of why programming projects often run late: "The first 90% of the code accounts for the first 10% of the development time. The remaining 10% of the code accounts for the other 90% of the development time." Any guidance which can redress this lack of foresight is worth considering. The size of a project or program has a significant effect on error rates, programmer productivity, and the amount of management needed.[3] Software quality encompasses maintainability, dependability, efficiency, and usability.
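Cargill's rule of thumb can be read as a back-of-the-envelope schedule check. A minimal sketch (the function name and the four-week figure are mine, purely illustrative):

```python
def ninety_ninety_total(time_for_first_90_percent: float) -> float:
    """Per Cargill's quip, the 'first 90%' of the code consumes only
    10% of the schedule, so the full effort is roughly ten times the
    time already spent reaching that point."""
    return time_for_first_90_percent / 0.10

# A team that took 4 weeks to get "90% done" should budget ~40 weeks total.
print(ninety_ninety_total(4.0))
```

Tongue-in-cheek, of course, but it makes the rule's asymmetry concrete.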

Linux powers 91% of the world's top supercomputers
With the biannual list of the top 500 supercomputers the world over released, the following graph of the operating systems used in those 500 petaflop-crunching machines, produced by the University of California, Berkeley, makes for an impressive visual glance at Linux's dominance in supercomputing. Check it out in its full interactive form to get down and dirty with the stats behind it.

Public Data Sets: Amazon Web Services
A corpus of web crawl data composed of over 5 billion web pages. This data set is freely available on Amazon S3 and is released under the Common Crawl Terms of Use. (Last modified: Mar 17, 2014)
Three NASA NEX datasets are now available, including climate projections and satellite images of Earth. (Last modified: Nov 12, 2013)
The Ensembl project produces genome databases for human as well as over 50 other species, and makes this information freely available. (Last modified: Oct 8, 2013)
Human Microbiome Project Data Set. (Last modified: Sep 26, 2013)
The 1000 Genomes Project, initiated in 2008, is an international public-private consortium that aims to build the most detailed map of human genetic variation available. (Last modified: Jul 18, 2012)
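Since these public data sets live in ordinary S3 buckets, any object can be fetched over plain HTTPS with no AWS account. A minimal sketch, assuming the well-known `commoncrawl` bucket; the helper function and the key shown are illustrative:

```python
def public_s3_url(bucket: str, key: str) -> str:
    """Build the anonymous HTTPS URL for an object in a public S3 bucket."""
    return f"https://{bucket}.s3.amazonaws.com/{key}"

# Example: a (hypothetical) path listing inside the Common Crawl corpus.
url = public_s3_url("commoncrawl", "crawl-data/CC-MAIN-2014-10/warc.paths.gz")
print(url)
```

The same URL works in a browser, with curl, or with any HTTP client, which is what makes these corpora usable without AWS credentials.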

Free Programming and Computer Science Books

The Best Linux Distros
On this page you will find the best Linux distros for various purposes. We've categorized them and picked only those we believe to be the best and most likely to be useful to you.
One of the most popular general-use distributions, with one of the largest selections of software. Based on: Debian
The most cutting-edge general-use distribution, on a 6-month release cycle.
A stable, general-use distribution for everyone that excels in enterprise environments. openSUSE includes a few different defaults, such as its package manager and the KDE desktop environment.
One of the most stable Linux distros in existence, with a large selection of software.
A highly customized distribution that includes many features out of the box that other distributions do not, including codecs.
A simple and elegant distribution which is entirely compatible with Ubuntu.
A general-purpose distribution centered around machine-specific optimization.

67 Open Source Replacements for Really Expensive Applications
Why spend thousands or even hundreds of thousands of dollars on a closed source application when you can get a comparable open source app for free? Even if you need commercial support, many open source programs now offer paid support that costs much less than the alternatives. For this list, we looked for quality open source alternatives to software that has a reputation for being expensive. We published a similar list last year, and we've updated and expanded the list for 2011. Accounting 1. 2. 3. 4. Audio Recording/Editing 5. 6. 7. Business Intelligence 8. 9. 10. 11. 12. Business Process Management 13. 14. 15. Customer Relationship Management 16. Database 17. 18. 19. Kexi: Replaces Microsoft Office Access 2010 ($139.95), FileMaker Pro 11 ($299). Calling itself "a long-awaited competitor for programs like MS Access or Filemaker," KDE's Kexi offers a set of features similar to both applications.

30 Places to Find Open Data on the Web
Finding an interesting data set and the story it tells can be the most difficult part of producing an infographic or data visualization. Data visualization is the end artifact, but it involves multiple steps: finding reliable data, getting the data in the right format, cleaning it up (a step whose time cost is often underestimated!) and then finding the story you will eventually visualize. Following is a list of useful resources for finding data. Your needs will vary from one project to another, but this list is a great place to start, and to bookmark. 1. Data.gov: This is the go-to resource for government-related data. 2. These are the places that house data from all kinds of sources. 3. Usually, the best place to get social data is the site's own API: Instagram, GetGlue, Foursquare; pretty much all social media sites have their own APIs. 4. Wunderground has detailed weather information and also lets you search historical data by zip code or city. 5. 6. 7.
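The cleaning step called out above is often just stray whitespace, blank filler rows, and string-typed numbers. A minimal standard-library sketch (the CSV sample here is invented for illustration):

```python
import csv
import io

RAW = """city, population ,year
Boston , 645966 ,2013
 ,  ,
Lowell ,106519,2013
"""

def clean_rows(text: str) -> list[dict]:
    """Trim stray whitespace, drop blank filler rows, coerce numeric fields."""
    reader = csv.DictReader(io.StringIO(text))
    reader.fieldnames = [name.strip() for name in reader.fieldnames]
    rows = []
    for row in reader:
        row = {k: (v or "").strip() for k, v in row.items()}
        if not any(row.values()):
            continue  # skip rows that are entirely empty
        row["population"] = int(row["population"])
        row["year"] = int(row["year"])
        rows.append(row)
    return rows

print(clean_rows(RAW))
```

Real-world data is messier than this, but the shape of the step (normalize, filter, coerce, then analyze) stays the same.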
