
CodeSOD: You Absolutely Don't Need It

by
snoofle
from The Daily WTF on (#320M6)

The progenitor of this story prefers to be called Mr. Syntax, perhaps because of the sins his boss committed in the name of attempting to program a spreadsheet-loader so generic that it could handle any potential spreadsheet with any data arranged in any conceivable format.

The boss had this idea that everything should be dynamic, even things that should be relatively straightforward, such as a web-originated bulk load of data from a spreadsheet into the database. Although only two such spreadsheet formats were in use, the boss wrote it to handle ANY spreadsheet. As you might imagine, this spawned mountains of uncommented and undocumented code to keep things generic. Mr. Syntax was tasked with locating and fixing the cause of a NullPointerException that should simply never have occurred. There was no stack dump. There were no logs. It was up to Mr. Syntax to seek out and destroy the problem.

Just to make it interesting, this process was slow, so the web service would spawn a job that would email the user with the status of the job. Of course, if there was an error, there would inevitably be no email.

It took an entire day of digging through this simple sheet-loader and the mountain of unrelated embedded code just to discover that the function convertExcelSheet blindly assumed that every cell would exist in every spreadsheet, regardless of format differences.
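For context (my note, not the submitter's): in Apache POI, which this loader is built on, Sheet.getRow() returns null for a row that was never populated, and Row.getCell() returns null for a missing cell, so any code that chains calls off those results without checking is one sparse spreadsheet away from exactly this NullPointerException. A minimal sketch of the failure mode:

// Illustrative sketch only, not from the submission: how a sparse sheet
// breaks code that assumes every row and cell exists.
import org.apache.poi.ss.usermodel.*;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class SparseSheetDemo {
    public static void main(String[] args) throws Exception {
        DataFormatter df = new DataFormatter();
        try (Workbook wb = new XSSFWorkbook()) {
            Sheet sheet = wb.createSheet("History");
            sheet.createRow(0).createCell(0).setCellValue("header");
            // Row 1 is never created; row 2 exists but has nothing in column 3.
            sheet.createRow(2).createCell(0).setCellValue("data");

            System.out.println(sheet.getRow(1) == null);            // true: missing row
            System.out.println(sheet.getRow(2).getCell(3) == null); // true: missing cell

            // The generic loader effectively did this, which throws a
            // NullPointerException the moment the row is missing:
            // String value = df.formatCellValue(sheet.getRow(1).getCell(3)).trim();
        }
    }
}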

[OP: in the interest of brevity, I've omitted all of the methods outside the direct call-chain...]

public class OperationsController extends BaseController {

    private final JobService jobService;

    @Inject
    public OperationsController(final JobService jobService) {
        this.jobService = jobService;
    }

    @RequestMapping(value = ".../bulk", method = RequestMethod.POST)
    public @ResponseBody SaveResponse bulkUpload(@AuthenticationPrincipal final User activeUser,
            @RequestParam("file") final MultipartFile file, final WebRequest web,
            final HttpServletRequest request) {
        SaveResponse response = new SaveResponse();
        try {
            if (getSystemAdmin(activeUser)) {
                final Map<String, Object> customParams = new HashMap<>();
                customParams.put(ThingBulkUpload.KEY_FILE, file.getInputStream());
                customParams.put(ThingBulkUpload.KEY_SERVER_NAME, request.getServerName());
                response = jobService.runJob((CustomUserDetails) activeUser, ThingBulkUpload.JOB_NAME, customParams);
            } else {
                response.setWorked(false);
                response.addError("ACCESS_ERROR", "Only Administrators can run bulk upload");
            }
        } catch (final Exception e) {
            logger.error("Unable to process file", e);
        }
        return response;
    }
}

@Service("jobService")
@Transactional
public class JobServiceImpl implements JobService {

    private static final Logger logger = LoggerFactory.getLogger(OperationsService.class);

    private final JobDAO jobDao;

    @Inject
    public JobServiceImpl(final JobDAO dao) {
        this.jobDao = dao;
    }

    public SaveResponse runJob(final @NotNull CustomUserDetails user, final @NotNull String jobName,
            final Map<String, Object> customParams) {
        SaveResponse response = new SaveResponse();
        try {
            Job job = (Job) jobDao.findFirstByProperty("Job", "name", jobName);
            if (job == null || job.getJobId() == null || job.getJobId() <= 0) {
                response.addError("Unable to find Job for name '" + jobName + "'");
                response.setWorked(false);
            } else {
                JobInstance ji = new JobInstance();
                ji.setCreatedBy(user.getUserId());
                ji.setCreatedDate(Util.getCurrentTimestamp());
                ji.setUpdatedBy(user.getUserId());
                ji.setUpdatedDate(Util.getCurrentTimestamp());
                ji.setJobStatus((JobStatus) jobDao.findFirstByProperty("JobStatus", "jobStatusId", JobStatus.KEY_INITIALZING));
                ji.setStartTime(Util.getCurrentTimestamp());
                ji.setJob(job);
                Boolean created = jobDao.saveHibernateEntity(ji);
                if (created) {
                    String className = job.getJobType().getJavaClass();
                    Class<?> c = Class.forName(className);
                    Constructor<?> cons = c.getConstructor(JobDAO.class, CustomUserDetails.class, JobInstance.class, Map.class);
                    BaseJobImpl baseJob = (BaseJobImpl) cons.newInstance(jobDao, user, ji, customParams);
                    baseJob.start();
                    ji.setUpdatedDate(Util.getCurrentTimestamp());
                    ji.setJobStatus((JobStatus) jobDao.findFirstByProperty("JobStatus", "jobStatusId", JobStatus.KEY_IN_PROCESS));
                    jobDao.updateHibernateEntity(ji);
                    StringBuffer successMessage = new StringBuffer();
                    successMessage.append("Job '").append(jobName).append("' has been started. ");
                    successMessage.append("An email will be sent to '").append(user.getUsername()).append("' when the job is complete. ");
                    String url = baseJob.generateCheckBackURL();
                    successMessage.append("You can also check the detailed status here: <a href=\"").append(url).append("\">").append(url).append("</a>");
                    response.addInfo(successMessage.toString());
                    response.setWorked(true);
                } else {
                    response.addError("Unable to create JobInstance for Job name '" + jobName + "'");
                    response.setWorked(false);
                }
            }
        } catch (Exception e) {
            String message = "Unable to runJob. Please contact support";
            logger.error(message, e);
            response.addError(message);
            response.setWorked(false);
        }
        return response;
    }
}

public class ThingBulkUpload extends BaseJobImpl {

    public static final String JOB_NAME = "Thing Bulk Upload";
    public static final String KEY_FILE = "file";

    public ThingBulkUpload(final JobDAO jobDAO, final CustomUserDetails user, final JobInstance jobInstance,
            final Map<String, Object> customParams) {
        super(jobDAO, user, jobInstance, customParams);
    }

    @Override
    public void run() {
        SaveResponse response = new SaveResponse();
        response.setWorked(false);
        try {
            final InputStream inputStream = (InputStream) getCustomParam(KEY_FILE);
            if (inputStream == null) {
                response.addError("Unable to run ThingBulkUpload; file is NULL");
            } else {
                final AnotherThingImporter cri = new AnotherThingImporter(customParams);
                cri.changeFileStream(inputStream);
                response = cri.importThingData(user);
            }
        } catch (final Exception e) {
            final String message = "Unable to finish ThingBulkUpload";
            logger.error(message, e);
            response.addError(message + ": " + e.getMessage());
        } finally {
            finalizeJob(response);
        }
    }
}

public class AnotherThingImporter {

    // Op: Instantiated this way, even though the impls are annotated with Spring's @Repository.
    private final LocationDAO locationDAO = new LocationDAOImpl();
    private final ContactDAO contactDAO = new ContactDAOImpl();
    private final EntityDAO entityDAO = new EntityDAOImpl();
    private final BaseHibernateDAO baseDAO = new BaseHibernateDAOImpl();
    // Op: snip a few dozen more DAOs

    private InputStream workbookStream = null;
    private final Map<String, Object> customParams;

    public AnotherThingImporter(final Map<String, Object> customParams) {
        this.customParams = customParams;
    }

    public void changeFileStream(final InputStream fileStream) {
        workbookStream = fileStream;
    }

    public SaveResponse importThingData(final CustomUserDetails adminUser) {
        final SaveResponse response = new SaveResponse();
        if (workbookStream == null) {
            throw new ThreeWonException("MISSING_FILE", "ClientRosterImporter was improperly created. No file found.");
        }
        try {
            final XSSFWorkbook workbook = new XSSFWorkbook(workbookStream);
            for (int i = 0; i < workbook.getNumberOfSheets(); i++) {
                final XSSFSheet sheet = workbook.getSheetAt(i);
                final String sheetName = sheet.getSheetName();
                // Op: snip 16 unrelated else ifs...
                } else if (sheetName.equalsIgnoreCase("History")) {
                    populateHistory(adminUser, response, sheet);
                }
                // Op: snip 3 more unrelated else ifs...
            }
        } catch (final IOException e) {
            throw new ThreeWonException("BAD_EXCEL_FILE", "Unable to open excel workbook.");
        }
        if (response.getErrors() == null || response.getErrors().size() <= 0) {
            response.setWorked(true);
        }
        return response;
    }

    // Op: snip 19 completely unrelated methods

    private void populateEducationHistory(final CustomUserDetails adminUser, final SaveResponse response, final XSSFSheet sheet) {
        final ThingDataConverter converter = new ThingDataConverterImpl(entityDAO, locationDAO, contactDAO);
        converter.convertExcelSheet(adminUser, response, sheet, customParams);
    }
}

public class ThingChildAssocConverter extends ThingDataConverter {

    public void convertExcelSheet(final CustomUserDetails adminUser, final SaveResponse response,
            final XSSFSheet sheet, final Map<String, Object> customParams) {
        initialize(customParams);
        final int rowCount = sheet.getPhysicalNumberOfRows();
        Integer numCreated = 0;
        for (int rowIndex = DEFAULT_HEADER_ROW_COUNT; rowIndex < rowCount; rowIndex++) {
            final XSSFRow currentRow = sheet.getRow(rowIndex);
            ...
            // Op: Null pointer thrown from row.getCell(...)
            //final String name = df.formatCellValue(currentRow.getCell(COL_INST_NUM));
            final String name = getValue(currentRow, COL_INST_NUM);
            ...
            // Op: creation of the record here
        }
    }

    protected String getValue(final XSSFRow row, final Integer column) {
        // Op: We can not assume that any given cell will exist on all spreadsheets
        try {
            return df.formatCellValue(row.getCell(column)).trim();
        } catch (final Exception e) {
            // avoid NullPointers by returning "" instead of null
            return "";
        }
    }
}
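The catch-everything getValue "fix" does stop the crash, but it also swallows any other failure in that cell read. A narrower sketch (mine, not the shop's; it assumes the converter's existing df DataFormatter field and the usual org.apache.poi.ss.usermodel imports for Cell and Row) would handle the two actual missing cases explicitly:

// Sketch of a narrower fix, not the shop's actual code: treat the two
// "missing" cases explicitly instead of catching every exception.
protected String getValue(final XSSFRow row, final Integer column) {
    if (row == null) {
        // sheet.getRow(...) returns null for rows that were never populated
        return "";
    }
    // CREATE_NULL_AS_BLANK hands formatCellValue a blank cell instead of null
    final Cell cell = row.getCell(column, Row.MissingCellPolicy.CREATE_NULL_AS_BLANK);
    return df.formatCellValue(cell).trim();
}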

All of this, as opposed to two simple methods that just retrieved the cells, in order, from each of the two spreadsheet formats actually in use.
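Something like the following would have covered each format in a handful of lines; it's a hypothetical sketch, with RosterRecord, the column positions, and the reuse of the null-safe getValue helper all invented for illustration:

// Hypothetical sketch of one of the "two simple methods"; the record type
// and column layout are invented, and getValue is the null-safe helper above.
private List<RosterRecord> loadFormatA(final XSSFSheet sheet) {
    final List<RosterRecord> records = new ArrayList<>();
    for (int rowIndex = 1; rowIndex <= sheet.getLastRowNum(); rowIndex++) {
        final XSSFRow row = sheet.getRow(rowIndex);
        if (row == null) {
            continue; // skip rows that were never populated
        }
        final RosterRecord record = new RosterRecord();
        record.setName(getValue(row, 0));        // format A: name in column 0
        record.setInstitution(getValue(row, 1)); // format A: institution in column 1
        records.add(record);
    }
    return records;
}
// loadFormatB would differ only in which column feeds which field.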
