@ECHO OFF/ON Command
The ECHO OFF/ON Command.
This command turns command echoing ON or OFF, i.e. whether the commands you put in a batch file are displayed on the console as they run.
http://www.instructables.com/id/Slightly-More-Advanced-Basic-Batch/step2/ECHO-OFFON-Command/
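A minimal sketch of how this looks in practice (the messages are just placeholders):

    @ECHO OFF
    REM With echo off, the commands below are not printed before they run
    ECHO Starting the build...
    ECHO ON
    REM From here on, every command is displayed before it executes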
Sunday, September 29, 2013
Eclipse Memory Analyzer (MAT)
Memory Analyzer (MAT)
The Eclipse Memory Analyzer is a fast and feature-rich Java heap analyzer that helps you find memory leaks and reduce memory consumption.
Use the Memory Analyzer to analyze productive heap dumps with hundreds of millions of objects, quickly calculate the retained sizes of objects, see who is preventing the Garbage Collector from collecting objects, and run a report to automatically extract leak suspects.
http://www.eclipse.org/mat/
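A quick, hedged sketch of how to produce a heap dump for MAT to open, assuming a standard HotSpot JDK; the process id, paths and jar name are placeholders:

    # Dump the heap of a running JVM (pid 12345) into a file MAT can open
    jmap -dump:live,format=b,file=heap.hprof 12345

    # Or let the JVM write a dump automatically when it runs out of memory
    java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps -jar app.jar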
Labels:
eclipse
build pipeline
Builds were typically done straight from the developer’s IDE and manually deployed to one of our app servers.
We had a manual process in place, where the developer would do the following steps.
Check all project code into Subversion and tag
Build the application.
Archive the application binary to a network drive
Deploy to production
Update our deployment wiki with the date and version number of the app that was just deployed
The tag and the wiki entry were what we relied on when we needed to either roll back to the previous version, or branch from the tag to do a bugfix.
http://java.dzone.com/articles/creating-build-pipeline-using
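Roughly, those manual steps map onto commands like the following; this is only a sketch, and the repository URL, version number, paths and host name are made up:

    # 1. Check all project code into Subversion and tag
    svn commit -m "Prepare release 1.4.2"
    svn copy http://svn.example.com/app/trunk \
             http://svn.example.com/app/tags/1.4.2 -m "Tag release 1.4.2"

    # 2. Build the application (assuming a Maven build)
    mvn clean package

    # 3. Archive the application binary to a network drive
    cp target/app-1.4.2.war /mnt/builds/app/

    # 4. Deploy to production (details depend on the app server)
    scp target/app-1.4.2.war deployer@prod01:/opt/appserver/deployments/

    # 5. Update the deployment wiki with the date and version number (manual step)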
Labels:
continuous integration
Saturday, September 28, 2013
Code Review Tools
- Before introducing code reviews in your project, agree on a procedure for how to do it, and make sure that everyone agrees with it and understands it. I can suggest the following procedure points, just to get started:
- The Review Assistant tool includes a custom check-in policy for TFS
- Set up a custom policy in TFS so that every check-in needs to pass a code review
- FishEye
Plan for a maximum of four hours development time
Keep the master branch as the production ready branch at all times
Always branch out a new task from master or release
Use a standard for naming your task branches (more on this below)
Use task branches for every change (no exception!)
When done coding, push the task branch and open a pull request
Do a code review of your own pull request before sending it
Never merge to master until you get at least one "Ok" comment from others
Check others' pull requests at least every two hours
http://share.ez.no/blogs/arne-bakkebo/branching-and-code-review-procedures-qa-4
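The same procedure sketched as Git commands; the branch name, remote name and issue id are illustrative only:

    git checkout master
    git pull                                      # always branch out from an up-to-date master
    git checkout -b task/ISSUE-123-fix-login      # task branch named after a standard pattern
    # ...code and commit...
    git push -u origin task/ISSUE-123-fix-login   # push the task branch and open a pull request
    # after reviewing your own pull request and getting at least one "Ok":
    git checkout master
    git merge --no-ff task/ISSUE-123-fix-login
    git push origin master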
Team Foundation Server provides check-in policies, which can be used to require a code review before changes are committed.
https://www.devart.com/review-assistant/docs/index.html?adding_code_review_policy_to_tfs_project.html
So, instead of developers having to shelve their changes manually and assign the review themselves, a custom check-in policy can enforce this step automatically.
http://blog.devart.com/creating-tfs-custom-check-in-policy-part-1.html
FishEye knows everything about your code: search, track, and visualize code changes.
Keep a pulse on everything about your code:
Visualize and report on activity, integrate source with JIRA issues, and search for commits, files, revisions, or people.
http://www.atlassian.com/software/fisheye/overview
- Stash
https://www.atlassian.com/software/stash
- Gerrit
Gerrit is a free, web-based team software code review tool.
Software developers in a team can review each other's modifications on their source code using a Web browser and approve or reject those changes.
It integrates closely with git, a distributed version control system.
https://en.wikipedia.org/wiki/Gerrit_(software)
Gerrit
Gerrit is a web-based code review system for projects that use the Git version control system.
https://code.google.com/p/gerrit/
- A code review process is a process in which a change by a developer is reviewed by other developers.
Every developer can suggest changes and update the suggested changes.
Once the change is approved by the reviewers, it can be merged into the main code base.
Gerrit is a code review system developed for the Git version control system.
Advantages of code review
Early error detection: errors are identified early in the process
Conformity with coding standards: code review allows the team to identify early in the process any violations with the code standards of the team
Knowledge exchange: the code review process allows new developers to see the code of other developers and to get early feedback on their suggested changes
Shared code ownership: by reviewing each other's code, the whole team builds up knowledge of the complete code base
Code reviews in open tools give people without permission to push to a repository a simple way to contribute their suggested changes and to get feedback.
http://www.vogella.com/articles/Gerrit/article.html
- Before you can commit the files to your repository, you need to add them.
Simply right click the shared project's node and navigate to Team => Add.
After this operation, the question mark should change to a plus symbol. To set certain folders or files to be ignored by Git, use Team => Ignore, which adds them to the .gitignore file.
After changing files in your project, a ">" sign will appear right after the icon, telling you the status of these files is dirty.
http://eclipsesource.com/blogs/tutorials/egit-tutorial/
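Roughly the same operations on the plain Git command line, for comparison (directory and file names are examples):

    git add src/                    # Team => Add: start tracking the new files
    echo "bin/" >> .gitignore       # Team => Ignore: keep build output out of Git
    git status                      # shows which files are untracked, staged or dirty
    git commit -m "Initial import"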
- Git blame
The git blame command is a versatile troubleshooting utility that has extensive usage options. The high-level function of git blame is the display of author metadata attached to specific committed lines in a file. This is used to examine specific points of a file's history and get context as to who the last author was that modified the line. This is used to explore the history of specific code and answer questions about what, how, and why the code was added to a repository.
Git Blame vs Git Log
While git blame displays the last author that modified a line, git log is the better tool for finding out when a line was originally added and how it has changed over time.
The git blame command is used to examine the contents of a file line by line and see when each line was last modified and by whom.
Online Git hosting solutions like Bitbucket offer blame views, which offer a superior user experience to command line git blame usage.
https://www.atlassian.com/git/tutorials/inspecting-a-repository/git-blame
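Typical git blame usage on the command line (the file names and line range are examples):

    git blame README.md              # annotate each line with the commit and author that last touched it
    git blame -L 10,20 Main.java     # limit the annotation to lines 10-20
    git blame -e Main.java           # show author e-mail addresses instead of names
    git log -L 10,20:Main.java       # follow the full history of those lines with git log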
Labels:
DevOps
Build Tools
- Apache Maven is a software project management and comprehension tool. Based on the concept of a project object model (POM), Maven can manage a project's build, reporting and documentation from a central piece of information.
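A minimal POM as a sketch of that "central piece of information"; the group id, artifact id and dependency are placeholders:

    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example</groupId>
      <artifactId>demo-app</artifactId>
      <version>1.0-SNAPSHOT</version>
      <packaging>jar</packaging>

      <dependencies>
        <dependency>
          <groupId>junit</groupId>
          <artifactId>junit</artifactId>
          <version>4.13.2</version>
          <scope>test</scope>
        </dependency>
      </dependencies>
    </project>

With this file in place, mvn clean package compiles, tests and packages the project.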
- Ant
Apache Ant is a Java library and command-line tool whose mission is to drive processes described in build files as targets and extension points dependent upon each other. The main known usage of Ant is the build of Java applications
http://ant.apache.org/
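A minimal build.xml sketch with targets depending on each other; the directory layout is an assumption:

    <project name="demo-app" default="jar">
      <target name="compile">
        <mkdir dir="build/classes"/>
        <javac srcdir="src" destdir="build/classes" includeantruntime="false"/>
      </target>
      <target name="jar" depends="compile">
        <jar destfile="build/demo-app.jar" basedir="build/classes"/>
      </target>
      <target name="clean">
        <delete dir="build"/>
      </target>
    </project>

Running ant builds the default jar target; ant clean removes the output.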
Gradle
http://www.gradle.org/
- gulp is a toolkit for automating painful or time-consuming tasks in your development workflow, so you can stop messing around and build something.
https://gulpjs.com/
- SBT
The interactive build tool
Define your tasks in Scala. Run them in parallel from sbt's interactive shell.
https://www.scala-sbt.org/index.html
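A small build.sbt sketch with a task defined in Scala; the project name and Scala version are just examples:

    name := "demo-app"
    scalaVersion := "2.13.12"

    // a custom task, runnable from the interactive sbt shell
    lazy val hello = taskKey[Unit]("Prints a greeting")
    hello := println("Hello from sbt")

From the sbt shell, typing hello runs the task; independent tasks in the build are executed in parallel.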
- Grunt
The Grunt ecosystem is huge and it's growing every day. With literally hundreds of plugins to choose from, you can use Grunt to automate just about anything with a minimum of effort. If someone hasn't already built what you need, authoring and publishing your own Grunt plugin to npm is a breeze.
Why use a task runner? In one word: automation. The less work you have to do when performing repetitive tasks like minification, compilation, unit testing and linting, the easier your job becomes.
http://gruntjs.com/
- Apache Ivy
The agile dependency manager
Apache Ivy is a popular dependency manager focusing on flexibility and simplicity
https://ant.apache.org/ivy/
- Java jar dependency management using Ivy
Along with our source code, we've also been putting all required jars into cvs, basically for two reasons:
The build process controlled by cruisecontrol needs the jars, but cruisecontrol runs on a server where we don't want to install an IDE (jdeveloper). We've put all adf, bc4j and other required jars into cvs, and when the build process compiles the application, it first gets all the required source code and the required jars, so no need for jdeveloper.
We want version control on dependencies. If you check out the code for release 1, you need the jars that release 1 depends on. If you check out release 2, you will need different jars.
So just relying on the project dependencies in jdeveloper is not an option.
The downside to this approach, however, is that we get a lot of jars in cvs, and many projects use the same jar files, so it's a bit of a waste. For some time, I thought the solution would be to move from using ant to using maven for building the code. However, some weeks ago, I discovered Ivy. Ivy is a java dependency manager, which you can easily use in existing projects using ant.
http://www.andrejkoelewijn.com/blog/2005/07/14/java-jar-dependency-management-using-ivy/
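A sketch of the ivy.xml that replaces jars checked into version control; the organisation, module and dependency revisions are examples. Ant's <ivy:retrieve> task then pulls the jars into the project at build time.

    <ivy-module version="2.0">
      <info organisation="com.example" module="demo-app"/>
      <dependencies>
        <dependency org="commons-lang" name="commons-lang" rev="2.6"/>
        <dependency org="log4j" name="log4j" rev="1.2.17"/>
      </dependencies>
    </ivy-module>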
Phing
It's a PHP project build system or build tool based on Apache Ant.
You can do anything with it that you could do with a traditional build system like GNU make, and its use of simple XML build files and extensible PHP "task" classes make it an easy-to-use and highly flexible build framework.
http://www.phing.info/
- Gant
Build software with Gant - IBM
Gant is a highly versatile build framework that leverages both Groovy and Apache Ant to let you implement programmatic logic while using all of Ant's tasks.
LuntBuild
Continuous Integration or nightly builds can be easily set up using a clean web interface.
Buildbot Basics
https://buildbot.net/
CMake is a cross-platform open-source meta-build system which can build, test and package software. It can be used to support multiple native build environments including make, Apple's Xcode and Microsoft Visual Studio.
https://github.com/ttroy50/cmake-examples
CMake is an open-source, cross-platform family of tools designed to build, test and package software. CMake is used to control the software compilation process using simple platform- and compiler-independent configuration files, and to generate native makefiles and workspaces that can be used in the compiler environment of your choice. The suite of CMake tools was created by Kitware in response to the need for a powerful, cross-platform build environment for open-source projects such as ITK and VTK.
https://cmake.org/
- In short, CMake helps you to manage and build your source code effectively. If you have trouble with gcc and Makefiles, just move to CMake.
https://tuannguyen68.gitbooks.io/learning-cmake-a-beginner-s-guide/content/chap1/chap1.html
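A minimal CMakeLists.txt sketch; the project, target and source file names are made up:

    cmake_minimum_required(VERSION 3.10)
    project(demo_app C)

    # CMake tracks header dependencies and generates the native build files
    add_executable(demo_app src/main.c)

From an empty build directory, cmake .. generates Makefiles (or an Xcode or Visual Studio project) and cmake --build . compiles the target.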
- First, it is almost unreadable. Second, it does not account for any header file dependencies. In fact, if any of the header files change, there's no way for the system to detect this at all, forcing the user to type in make clean, then make to make everything all over again. Third, given the sheer number of files, make will take minutes to run, which kills productivity. Fourth, if you look closely at the path, you will observe that the person writing this Makefile used a Mac with MacPorts and therefore is the kind of person who is just wrong. Fifth, this Makefile obviously won't work for someone on Linux or Windows; even worse, it won't work for another Mac user who used HomeBrew or compiled OpenCV from source. Sixth, some rules are just commented out, which indicates sloppy behaviour. Writing Makefiles like this just sucks. It sucks productivity and makes me think of slow and painful death.
I'll now show how I replaced this terrible Makefile with a clean, modular CMake-based build.
But what if I'm on Windows, or if I just am the kind of Mac user who prefers HomeBrew?
https://skandhurkat.com/post/intro-to-cmake/
- I love CMake; it allows me to write cross-platform code and be confident that the build system will work across a choice of compilers, IDEs, and operating systems.
https://skandhurkat.com/post/intro-to-ctest/
SCons is an Open Source software construction tool—that is, a next-generation build tool. Think of SCons as an improved, cross-platform substitute for the classic Make utility with integrated functionality similar to autoconf/automake and compiler caches such as ccache. In short, SCons is an easier, more reliable and faster way to build software.
https://scons.org/
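A minimal SConstruct sketch (the file names are examples); SCons build files are plain Python:

    # SConstruct
    Program('hello', ['hello.c'])   # builds the hello executable and tracks its header dependencies

Running scons in the same directory performs the build.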
- Yarn is a package manager for your code. It allows you to use and share code with other developers from around the world. Yarn does this quickly, securely, and reliably so you don't ever have to worry.
https://yarnpkg.com/en/
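Typical Yarn usage; the package name is just an example:

    yarn init            # create a package.json for the project
    yarn add lodash      # add a dependency, recorded in package.json and yarn.lock
    yarn install         # reproduce the exact same dependency tree on another machine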
Labels:
build automation,
DevOps
maven repository management
- Nexus
Download Nexus and gain control over open source consumption and internal collaboration.
http://www.sonatype.org/nexus/
Artifactory
Proxy and local cache: it proxies remote repositories and keeps a local cache of downloaded artifacts, so each artifact is fetched from the internet only once.
Control: it blocks unwanted (and sometimes security-sensitive) external requests for internal artifacts and controls how and where artifacts are deployed.
http://www.jfrog.com/home/v_artifactory_opensource_overview
- Empower Hudson with Artifactory - Track and Replay Your Build Artifacts
Using one of the different flavors of version control applications, you can easily reproduce the state of any point in the past using the different methods of SCM tagging.
What happens when you want to reproduce binary products from a certain phase?
Are dependencies considered?
Does anyone really remember?
What if you used version ranges or dynamic properties?
Was the application compiled using JDK 5 or 6?
Your CI server has all the knowledge required:
Information on the builds themselves
The published items
Version information
Dependencies
Build environment details
Using Hudson (and other CI servers to come) together with the Artifactory plugin, we:
Supplied Hudson with all the needed dependencies from Artifactory
Deployed all produced binaries to Artifactory
Published build information to Artifactory
With this build information captured, any past build can later be traced and reproduced.
http://java.dzone.com/articles/empower-hudson-artifactory
Archiva
Apache Archiva is a build artifact repository manager.
http://archiva.apache.org/
- In a continuous integration environment, where builds are often triggered by checking in artifacts, there is the potential for a large number of builds to be executed. Each of these builds, at least the successful ones, results in some artifacts being published into the repository. These can start consuming a lot of space, and it is important to manage them.
Repository Purge by Number of Days Older
Repository Purge by Retention Count
Both methods are configured in the managed repository settings.
Repository Purge by Number of Days Older
Repository Purge By Retention Count
To use this method, you must set the purge-by-days-older value to 0.
http://docs.oracle.com/middleware/1212/core/MAVEN/populating_archiva.htm
- The Maven Repository Manager, Archiva in this case, includes the following:
Internal:
Snapshot:
Mirror:
Dev, test, and production repositories
http://docs.oracle.com/middleware/1212/core/MAVEN/intro_ref_ede.htm
- We use Maven 2 to resolve dependencies and build the source code into packages.
Hudson will be the tool to start up the build process every day and notify the developers when the build fails.
By default you'll have 2 managed repositories: an internal one you can use as a proxy for the company, and a snapshots repository to put your snapshot builds in.
Besides those there are also remote repositories. These are other Maven repositories where Archiva will look for dependencies when they aren't in your own repositories.
You can also add remote repositories. This is useful if you want to set up a proxy repository for your company. This way your company users only have to access the internal repository and don't need to go to the internet.
You can let Hudson poll the SCM and rebuild after every commit. You can also let it build periodically by specifying a cron job.
Archiva starts up with two hosted repositories configured:
Internal
The internal repository is for maintaining fixed-version released artifacts deployed by your organization, which includes finished versions of artifacts, versions that are no longer in development, and released versions. Note that in-development versions of the same artifacts may exist in the snapshot repository in the future.
Snapshot
The snapshot repository holds the work-in-progress artifacts, which are denoted with a version with the suffix SNAPSHOT, and artifacts that have not yet been released.
http://docs.oracle.com/middleware/1212/core/MAVEN/populating_archiva.htm
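A sketch of the distributionManagement section that publishes releases to the internal repository and SNAPSHOT builds to the snapshot repository; the Archiva host name and repository ids are assumptions:

    <distributionManagement>
      <repository>
        <id>archiva.internal</id>
        <url>http://archiva.example.com:8080/repository/internal/</url>
      </repository>
      <snapshotRepository>
        <id>archiva.snapshots</id>
        <url>http://archiva.example.com:8080/repository/snapshots/</url>
      </snapshotRepository>
    </distributionManagement>

mvn deploy then picks the repository based on whether the version ends in -SNAPSHOT; the matching credentials go into settings.xml under <servers>.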
- Efficiency. Repository acts as a cache for Maven Central artifacts
Resilience. Repository protects against remote repository failures or lack of internet connection
Repeatability. Storing common artifacts centrally, avoids shared build failures caused by developers maintaining their own local repositories.
Audit. If all 3rd party libraries used by development come from a single entry point in the build process, one can assess how often they're used (based on download log files) and what kinds of licensing conditions apply.
http://stackoverflow.com/questions/8259118/good-configuration-for-archiva
- By default, Archiva comes with a proxy to the Maven central repository. Therefore, it's basically all set up to be used as a mirror of the Maven central repository. If we make a request of Archiva for a central repository artifact, it will download that artifact and return it to us. If we make future requests for that artifact, Archiva will return the copy of the artifact that it had already downloaded from the central repository.
http://www.avajava.com/tutorials/lessons/how-do-i-use-archiva-as-a-mirror-of-the-maven-central-repository.html
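A sketch of the settings.xml entry that routes all dependency requests through Archiva instead of going straight to Maven Central; the host name is made up:

    <settings>
      <mirrors>
        <mirror>
          <id>archiva</id>
          <mirrorOf>*</mirrorOf>
          <url>http://archiva.example.com:8080/repository/internal/</url>
        </mirror>
      </mirrors>
    </settings>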
Labels:
build automation,
DevOps